Java MultiMap Not Recognizing Key

I'm trying to store multiple values for a key in a data structure so I'm using Guava (Google Collection)'s MultiMap.
Multimap<double[], double[]> destinations = HashMultimap.create();
destinations = ArrayListMultimap.create();
double[] startingPoint = new double[] {1.0, 2.0};
double[] end = new double[] {3.0, 4.0};
destinations.put(startingPoint, end);
System.out.println(destinations.containsKey(startingPoint));
and it returns false.
Note: key-value pairs are being stored in the multimap, since destinations.size() increases when I put something in. The problem also does not happen when the keys are String instead of double[].
Any idea what the problem is?
Edit: Many thanks to Jon Skeet; I have now implemented the class:
class Point {
double lat;
double lng;
public boolean equals(Point p) {
if (lat == p.lat && lng == p.lng)
return true;
else
return false;
}
@Override
public int hashCode() {
int hash = 29;
hash = hash*41 + (int)(lat * 100000);
hash = hash*41 + (int)(lng * 100000);
return hash;
}
public Point(double newlat, double newlng) {
lat = newlat;
lng = newlng;
}
}
And now I have a new problem. This is how I'm using it:
Multimap<Point, Point> destinations = HashMultimap.create();
destinations = ArrayListMultimap.create();
Point startingPoint = new Point(1.0, 2.0);
Point end = new Point(3.0, 4.0);
destinations.put(startingPoint, end);
System.out.println( destinations.containsKey(startingPoint) );
System.out.println( destinations.containsKey(new Point(1.0, 2.0)) );
The first one returns true, the second one returns false. It also gives me an error if I put @Override before the equals method. Any idea what the problem is now?
Thanks :)
Edit2: It now behaves exactly as expected when I changed equals to this:
@Override
public boolean equals(Object p) {
if (this == p)
return true;
else if ( !(p instanceof Point) )
return false;
else {
Point that = (Point) p;
return (that.lat == lat) && (that.lng == lng);
}
}
Thanks everyone.

You're using arrays as the hash keys. That's not going to work - Java doesn't override hashCode and equals for arrays. (The Arrays class provides methods to do this, but it's not going to help you here.) Admittedly I'd expect it to work in this specific case, where you're using the exact same reference for both put and containsKey... When I test your code, it prints true. Are you sure you can reproduce it with exactly your code?
For example, while I'd expect it to work for the code you've given, I wouldn't expect this to work:
// Logically equal array, but distinct objects
double[] key = (double[]) startingPoint.clone();
System.out.println(destinations.containsKey(key));
It sounds like you shouldn't really be using double[] here - you should create a Point class which has two double variables, and overrides equals and hashCode.
Additionally, using double values in hash keys is usually a bad idea anyway, due to the nature of binary floating point arithmetic. That's going to be a problem even using the Point idea above... it should be okay if you don't need to actually do any arithmetic (if you're just copying values around) but take great care...
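For instance, a quick sketch (mine, not from the answer above) of why arithmetic on double keys is risky:
public class DoubleKeyCaution {
    public static void main(String[] args) {
        // Binary floating point: 0.1 + 0.2 is not exactly 0.3
        System.out.println(0.1 + 0.2 == 0.3);  // false
        System.out.println(0.1 + 0.2);         // 0.30000000000000004
        // So a key computed by arithmetic may miss an entry stored under the literal value
    }
}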

The problem is that two arrays with equal contents do not produce the same hash code, because arrays use identity-based hashCode rather than hashing their contents. For example:
public static void main(String[] args) {
System.out.println(new double[]{1.0, 2.0}.hashCode());
System.out.println(new double[]{1.0, 2.0}.hashCode());
}
will print something like
306344348
1211154977
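If you did want value-based behaviour for double[] you would have to go through java.util.Arrays explicitly; a minimal sketch (note that HashMultimap will not call these for you, which is why a small Point-style key class is the better fix):
import java.util.Arrays;

public class ArrayHashDemo {
    public static void main(String[] args) {
        double[] a = new double[] {1.0, 2.0};
        double[] b = new double[] {1.0, 2.0};
        // Identity-based hashCode: two distinct arrays, almost certainly two different values
        System.out.println(a.hashCode());
        System.out.println(b.hashCode());
        // Content-based: the same value for equal contents, every time
        System.out.println(Arrays.hashCode(a));
        System.out.println(Arrays.hashCode(b));
        System.out.println(Arrays.equals(a, b)); // true
    }
}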

Make longitude and latitude as Key of HashMap in Java

I have data like this:
23.3445556 72.4535455 0.23434
23.3645556 72.4235455 0.53434
23.3245556 72.4635455 0.21434
23.3645556 72.2535455 0.25434
I want to make HashMap like this:
HashMap<23.34444,72.23455,0.2345566> demo = new HashMap()
Here 23.34444,72.23455 is a key and 0.2345566 is value.
This is because I want to traverse HashMap like this:
if(demo.latitude < 21.45454545 && demo.longitude > 72.3455)
//get the value from hashMap
Long/lat represent a particular pixel on the map, and each pixel has a value. I want to get the average value over a particular area (say x by y), and there will be up to 1 million pixels.
I also want to know whether this is a good way to do it, since it will get millions of hits daily.
You could use the Point class to start off.
https://docs.oracle.com/javase/7/docs/api/java/awt/Point.html
int xE6 = (int) (x * 1e6);
int yE6 = (int) (y * 1e6);
Point point = new Point(xE6, yE6);
But since this is awt specific and a misuse of the class, you will probably eventually want to create your own.
public final class LatLon {
private double lat;
private double lon;
public LatLon(double lat, double lon) {
this.lat = lat;
this.lon = lon;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
LatLon latLon = (LatLon) o;
if (Double.compare(latLon.lat, lat) != 0) return false;
return Double.compare(latLon.lon, lon) == 0;
}
@Override
public int hashCode() {
int result;
long temp;
temp = Double.doubleToLongBits(lat);
result = (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(lon);
result = 31 * result + (int) (temp ^ (temp >>> 32));
return result;
}
public double getLat() {
return lat;
}
public void setLat(double lat) {
this.lat = lat;
}
public double getLon() {
return lon;
}
public void setLon(double lon) {
this.lon = lon;
}
}
(autogenerated using IntelliJ)
This can be used like
public static void main(String[] args) {
HashMap<LatLon, Double> demo = new HashMap<LatLon, Double>();
demo.put(new LatLon(23.3445556,72.4535455), 0.23434);
demo.put(new LatLon(23.3645556,72.4235455), 0.53434);
demo.put(new LatLon(23.3245556,72.4635455), 0.21434);
demo.put(new LatLon(23.3645556,72.2535455), 0.25434);
System.out.println(demo.get(new LatLon(23.3645556,72.2535455))); //0.25434
}
The problem with using this class as is, is that it uses doubles, whereas you want a defined precision, given by the number of decimal places.
Doubles have awkward arithmetic and can give you accuracy errors, so I heartily recommend using a library designed for geo-coordinates.
Especially given
if(demo.latitude<21.45454545 && demo.longitude >72.3455)
This sort of check is best served by some sort of purpose built collection for dealing with bounds checks and co-ordinates if you end up hitting performance problems.
I think you are approaching the problem the wrong way. Using a HashMap will not work correctly with greater than or lesser than comparisons. What would happen if you had 2 latlong keys that matched your comparison? What value do you choose?
I would probably solve your problem like this:
First, create a class that will contain both your "key" values and your "value" value
public class GeoValue {
double lat;
double lon;
double value;
}
Then, add a comparison method to the class
public boolean lessThanLatGreaterThanLon(double lat, double lon) {
return this.lat < lat && this.lon > lon;
}
Add all of those created objects to a Set type collection. If you use a HashSet, make sure that you also override the .equals() and .hashCode() methods for your GeoValue class.
To find the values you want, you can use a filter method (if you're on Java 8, for example):
final double lat = 3.5D;
final double lon = 4.5D;
Set<GeoValue> matchingValues = geoValues.stream()
.filter(geo -> geo.lessThanLatGreaterThanLon(lat, lon))
.collect(Collectors.toSet());
And you're ready to go.
If it's a demo you're creating I would suggest creating an enum class with each coordinate you want to showcase as a separate enum object or as a key to the HashMap.
If that doesn't work for you, I would create a "Coordinates" class and store the keys there. You would have to override the hashCode and equals methods though, or it might not behave like you want it to (see the sketch after the example below).
Example
public class Coordinates {
double latitude, longitude;
}
...
HashMap<Coordinates, Double> demo = new HashMap<>(); /* Note: An object of Coordinates is the key. So, you first have to make an object of Coordinates class, put the latitude and longitude values and then put in the HashMap as key.*/
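A hedged sketch of what that Coordinates class could look like with the overrides in place (exact equality on the doubles is an assumption; round or scale the values first if you need a tolerance):
import java.util.Objects;

public class Coordinates {
    private final double latitude;
    private final double longitude;

    public Coordinates(double latitude, double longitude) {
        this.latitude = latitude;
        this.longitude = longitude;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Coordinates)) return false;
        Coordinates c = (Coordinates) o;
        // Same bit-wise comparison that Double.equals uses
        return Double.compare(latitude, c.latitude) == 0
                && Double.compare(longitude, c.longitude) == 0;
    }

    @Override
    public int hashCode() {
        return Objects.hash(latitude, longitude);
    }
}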
HashMap won't be of use for your needs, because it's not meant for range queries, i.e. give me the entry whose key is closest to 12.0, or give me all entries between keys 10.0 and 20.0.
There are special-purpose structures that deal with geo points efficiently, e.g. the R-tree or R*-tree.
These kinds of trees require you to index your data based on a geo-point-like structure, usually a latitude/longitude pair, though they also allow you to index data based on geo-shapes.
Creating a lat/lon pair object to be used as the key (as suggested in other answers) is only useful if you use a specialized structure that stores and indexes spatial data. Otherwise, having such a pair will be pointless, because you won't be able to search for points that are near a given location, or points that lie within a given rectangle, etc.
Now, if you don't want to go the R-tree's way and you can live with quite limited spatial queries, you might want to consider using the following structure:
TreeMap<Double, TreeMap<Double, Double>> demo = new TreeMap<>();
This is a TreeMap of TreeMaps, and the idea is to have the latitude as the key of the outer map and the longitude as the key of the inner maps. So you will always have to search first by latitude, then by longitude.
If this is OK for you, you can take advantage of some very useful methods of TreeMap, such as headMap, tailMap and subMap, to name the most relevant ones.
For example, if you want to find all the points within the rectangle determined by its upper left corner [-10.0, -10.0] and its lower right corner [10.0, 10.0], you could do it as follows:
// Get all points with latitude between -10.0 and 10.0
SortedMap<Double, TreeMap<Double, Double>> byLat = demo.subMap(-10.0, 10.0);
// Now print points from byLat submap with longitude between -10.0 and 10.0
byLat.entrySet().stream()
.map(e -> e.getValue().subMap(-10.0, 10.0))
.forEach(System.out::println);
Even for 1 million points, performance will be reasonable, though not the best, because TreeMap is a general-purpose Map implementation based on a red-black tree, which has O(log n) time complexity.
On the other hand, if you are willing to install some software, I recommend you use Elasticsearch with Geolocation. It has geo-point and geo-shaped specialized datatypes that will make your life easier. This search engine has excellent performance and scales horizontally up to thousands of nodes, so memory, lookup times, etc won't be a problem.
You can generate a hash code based on the longitude and latitude and then use that hash code as the key to store your values. That way it will be simpler than using them directly or converting them into a point, as there is no use for the point later on.
You can also make use of the Point2D class available as part of java.awt.geom. It is abstract, but it gives you equals/hashCode, etc. all built in; you can extend it, or simply use its concrete nested Point2D.Double class. For integer coordinates you could just use the Point class from java.awt (no need to extend there either).
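For example, a minimal sketch using the concrete Point2D.Double nested class directly (sample coordinates taken from the question):
import java.awt.geom.Point2D;
import java.util.HashMap;
import java.util.Map;

public class Point2DKeyDemo {
    public static void main(String[] args) {
        Map<Point2D, Double> demo = new HashMap<Point2D, Double>();
        // Point2D supplies value-based equals() and hashCode(), so lookups with a new instance work
        demo.put(new Point2D.Double(23.3445556, 72.4535455), 0.23434);
        System.out.println(demo.get(new Point2D.Double(23.3445556, 72.4535455))); // 0.23434
    }
}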

For which double value is a comparison with itself the fastest?

Just to put my question in context: I have a class that sorts a list in its constructor, based on some calculated score per element. Now I want to extend my code to a version of the class that does not sort the list. The easiest (but obviously not clean, I'm fully aware, but time is pressing and I don't have time to refactor my code at the moment) solution would be to just use a score calculator that assigns the same score to every element.
Which double value should I pick? I was personally thinking +Infinity or -Infinity since I assume these have a special representation, meaning they can be compared fast. Is this a correct assumption? I do not know enough about the low level implementation of java to figure out if I am correct.
In general avoid 0.0, -0.0 and NaN. Any other number would be fine. You may look into Double.compare implementation to see that they are handled specially:
if (d1 < d2)
return -1; // Neither val is NaN, thisVal is smaller
if (d1 > d2)
return 1; // Neither val is NaN, thisVal is larger
// Cannot use doubleToRawLongBits because of possibility of NaNs.
long thisBits = Double.doubleToLongBits(d1);
long anotherBits = Double.doubleToLongBits(d2);
return (thisBits == anotherBits ? 0 : // Values are equal
(thisBits < anotherBits ? -1 : // (-0.0, 0.0) or (!NaN, NaN)
1)); // (0.0, -0.0) or (NaN, !NaN)
However that depends on how your sorting comparator is implemented. If you don't use Double.compare, then probably it doesn't matter.
Note that, except for these special cases with 0.0/-0.0/NaN, double comparison is handled directly by the CPU and is really fast, so you are unlikely to see any significant comparison overhead compared to the other code you already have.
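A quick demonstration of those special cases (plain Double.compare calls, nothing specific to your sorting code):
public class DoubleCompareDemo {
    public static void main(String[] args) {
        // Ordinary unequal values are decided by the fast < / > branches
        System.out.println(Double.compare(1.0, 2.0));        // -1
        // The special cases fall through to the doubleToLongBits comparison
        System.out.println(Double.compare(-0.0, 0.0));       // -1 (-0.0 sorts before 0.0)
        System.out.println(Double.compare(Double.NaN, 1.0)); // 1  (NaN sorts after everything)
    }
}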
Not sure how this would fit in, but have you considered writing your own?
It just seems a little concerning that you are looking for an object with specific performance characteristics that are unlikely to consistently appear in a general implementation. Even if you find a perfect candidate by experiment or even from source code you could not guarantee the contract.
static class ConstDouble extends Number implements Comparable<Number> {
private final Double d;
private final int intValue;
private final long longValue;
private final float floatValue;
public ConstDouble(Double d) {
this.d = d;
this.intValue = d.intValue();
this.longValue = d.longValue();
this.floatValue = d.floatValue();
}
public ConstDouble(long i) {
this((double) i);
}
// Implement Number
@Override
public int intValue() {
return intValue;
}
@Override
public long longValue() {
return longValue;
}
@Override
public float floatValue() {
return floatValue;
}
@Override
public double doubleValue() {
return d;
}
// Implement Comparable<Number> fast.
@Override
public int compareTo(Number o) {
// Core requirement - comparing with myself will always be fastest.
if (o == this) {
return 0;
}
return Double.compare(d, o.doubleValue());
}
}
// Special constant to use appropriately.
public static final ConstDouble ZERO = new ConstDouble(0);
public void test() {
// Will use ordinary compare.
int d1 = new ConstDouble(0).compareTo(new Double(0));
// Will use fast compare.
int d2 = ZERO.compareTo(new Double(0));
// Guaranteed to return 0 in the shortest time.
int d3 = ZERO.compareTo(ZERO);
}
Obviously you would need to use Comparable<Number> rather than Double in your collections but that may not be a bad thing. You could probably craft a mechanism to ensure that the fast-track compare is always used in preference (depends on your usage).

equals and hashCode: Is Objects.hash method broken?

I am using Java 7, and I have the following class below. I implemented equals and hashCode correctly, but the problem is that equals returns false in the main method below yet hashCode returns the same hash code for both objects. Can I get more sets of eyes to look at this class to see if I'm doing anything wrong here?
UPDATE: I replaced the line on which I call the Objects.hash method with my own hash function: chamorro.hashCode() + english.hashCode() + notes.hashCode(). It returns a different hash code, which is what hashCode is supposed to do when two objects are different. Is the Objects.hash method broken?
Your help will be greatly appreciated!
import org.apache.commons.lang3.StringEscapeUtils;
public class ChamorroEntry {
private String chamorro, english, notes;
public ChamorroEntry(String chamorro, String english, String notes) {
this.chamorro = StringEscapeUtils.unescapeHtml4(chamorro.trim());
this.english = StringEscapeUtils.unescapeHtml4(english.trim());
this.notes = notes.trim();
}
@Override
public boolean equals(Object object) {
if (!(object instanceof ChamorroEntry)) {
return false;
}
if (this == object) {
return true;
}
ChamorroEntry entry = (ChamorroEntry) object;
return chamorro.equals(entry.chamorro) && english.equals(entry.english)
&& notes.equals(entry.notes);
}
@Override
public int hashCode() {
return java.util.Objects.hash(chamorro, english, notes);
}
public static void main(String... args) {
ChamorroEntry entry1 = new ChamorroEntry("Åguigan", "Second island south of Saipan. Åguihan.", "");
ChamorroEntry entry2 = new ChamorroEntry("Åguihan", "Second island south of Saipan. Åguigan.", "");
System.err.println(entry1.equals(entry2)); // returns false
System.err.println(entry1.hashCode() + "\n" + entry2.hashCode()); // returns same hash code!
}
}
Actually, you happened to trigger pure coincidence. :)
Objects.hash happens to be implemented by successively multiplying an accumulator by 31 and adding the hash code of each given object, and String.hashCode does the same with each of its characters. By coincidence, the difference between your "English" strings occurs exactly one position further from the end of the string than the same difference in the "Chamorro" strings, while the Chamorro hash gets multiplied by 31 one extra time in Objects.hash. Since the swapped characters ('g' and 'h') change in opposite directions in the two fields, everything cancels out perfectly. Congratulations!
Try with other strings, and you'll probably find that it works as expected. As others have already pointed out, this effect is not actually wrong, strictly speaking, since hash codes may correctly collide even if the objects they represent are unequal. If anything, it might be worthwhile trying to find a more efficient hash, but I hardly think it should be necessary in realistic situations.
There is no requirement that unequal objects must have different hashCodes. Equal objects are expected to have equal hashCodes, but hash collisions are not forbidden. return 1; would be a perfectly legal implementation of hashCode, if not very useful.
There are only 32 bits worth of possible hash codes, and an unbounded number of possible objects, after all :) Collisions will happen sometimes.
A hash code being a 32-bit int value, there is always a possibility of collisions (the same hash code for two different objects), but they are rare and usually coincidental. Your example is one such highly coincidental case. Here is the explanation.
When you call Objects.hash, it internally calls Arrays.hashCode() with the logic below:
public static int hashCode(Object a[]) {
if (a == null)
return 0;
int result = 1;
for (Object element : a)
result = 31 * result + (element == null ? 0 : element.hashCode());
return result;
}
For your 3-param hashCode, this works out to:
31 * (31 * (31 * 1 + hashOfString1) + hashOfString2) + hashOfString3
For your first object, the hash values of the individual Strings are:
chamorro --> 1140493257
english --> 1698758127
notes --> 0
And for the second object:
chamorro --> 1140494218
english --> 1698728336
notes --> 0
If you notice, the first two hash values differ between the two objects.
But when the final hash code is computed as:
int hashCode1 = 31*(31*(31+1140493257) + 1698758127)+0;
int hashCode2 = 31*(31*(31+1140494218) + 1698728336)+0;
Coincidentally, both work out to the same hash code, 1919283673 (after the int arithmetic wraps around at 32 bits), because the two differences cancel out exactly.
Verify the theory yourself by using the code segment below:
public static void main(String... args) {
ChamorroEntry entry1 = new ChamorroEntry("Åguigan",
"Second island south of Saipan. Åguihan.", "");
ChamorroEntry entry2 = new ChamorroEntry("Åguihan",
"Second island south of Saipan. Åguigan.", "");
System.out.println(entry1.equals(entry2)); // returns false
System.out.println("Åguigan".hashCode());
System.out.println("Åguihan".hashCode());
System.out.println("Second island south of Saipan. Åguihan.".hashCode());
System.out.println("Second island south of Saipan. Åguigan.".hashCode());
System.out.println("".hashCode());
System.out.println("".hashCode());
int hashCode1 = 31*(31*(31+1140493257) + 1698758127)+0;
int hashCode2 = 31*(31*(31+1140494218) + 1698728336)+0;
System.out.println(entry1.hashCode() + "\n" + entry2.hashCode());
System.out.println(getHashCode(
new String[]{entry1.chamorro, entry1.english, entry1.notes})
+ "\n" + getHashCode(
new String[]{entry2.chamorro, entry2.english, entry2.notes}));
System.out.println(hashCode1 + "\n" + hashCode2); // returns same hash code!
}
public static int getHashCode(Object a[]) {
if (a == null)
return 0;
int result = 1;
for (Object element : a)
result = 31 * result + (element == null ? 0 : element.hashCode());
return result;
}
If you use some different string parameters, it will almost certainly result in different hash codes.
It's not necessary for two unequal objects to have different hashes; the important thing is to have the same hash for two equal objects.
I can implement hashCode() like this :
public int hashCode() {
return 5;
}
and it will stay correct (but inefficient).

hashmap custom class key && object saving/loading

Been working on a project for a while now and I've come across a few different complications and solutions that don't seem to pan out together.
final public class place implements Serializable {
private static final long serialVersionUID = -8851896330953573877L;
String world;
Double X;
Double Y;
Double Z;
}
HashMap<place, Long> blockmap = new HashMap<place, Long>(); // does not work
HashMap<Location, Long> blockmap = new HashMap<Location, Long>(); //works
First, my hashmap contains the time an item was placed (or added) to the world. place is a 'class place {}' containing String world, double x, double y, double z. The problem I've had with this is that it doesn't work with hashmaps: I can store a new hash key using it, but I can't call get to retrieve its value. Using Location instead fixes this problem and works flawlessly.
public void SetBlock(Block block) {
Location loc = new Location(null, block.getLocation().getX(),block.getLocation().getY(),block.getLocation().getZ());
//...
Long time = (long) (System.currentTimeMillis() / 60000);
//...
if (blockmap.containsKey(loc)) {
blockmap.remove(loc);
blockmap.put(loc, time);
//System.out.println("MyLeveler: Block Existed, Updated");
} else {
blockmap.put(loc, time);
//System.out.println("MyLeveler: Block added to " + loc.getX() + ", " + loc.getY() + ", " + loc.getZ());
//System.out.println("MyLeveler: total blocks saved: " + blockmap.size());
}
}
This works without error. Now, for my purposes, this data has to be saved and reloaded when the plugin is disabled and enabled. To do this, I created a new Java class with a save/load feature.
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
public class SLAPI {
public static void save(Object obj,String path) throws Exception
{
ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(path));
oos.writeObject(obj);
oos.flush();
oos.close();
}
public static Object load(String path) throws Exception
{
ObjectInputStream ois = new ObjectInputStream(new FileInputStream(path));
Object result = ois.readObject();
ois.close();
return result;
}
}
I typically get "not serializable" errors. Using 'implements Serializable' and ois.defaultReadObject() or oos.defaultWriteObject(), which checks the serial version of the file, results in a clean save/load only when the object is EMPTY! When it contains data, I constantly get "java.io.WriteAbortedException: writing aborted; java.io.NotSerializableException".
This is clearly a problem! One of the recommendations here, ArrayList custom class as HashMap key, failed to produce any better results. In fact, creating a custom class was my first issue to begin with >.>
So i guess the questions are:
1) What would I have to alter to use the custom class as a key (and have it work properly)?
2) Why doesn't it recognize that I'm marking the class as serializable?
3) Why does it work with an empty hashmap, but not with a filled hashmap?
Basically you need to override hashCode() and equals() in place. Presumably Location already overrides these methods.
Those are the methods that HashMap uses to first narrow down the list of candidate keys very quickly (using the hash code) and then check them for equality (by calling equals).
It's not clear what the serialization problem is - my guess is that although place is serializable, Location isn't. If you could post a short but complete program demonstrating the problem, that would really help. (It would also be a good idea to start following Java naming conventions, and to make your fields private...)
EDIT: Here's an example of the Place class with hash code and equality. Note that I've made it immutable for the sake of avoiding the values changing after it's used as a key in a hash map - I don't know offhand how well that works with serialization, but hopefully it's okay:
public final class Place implements Serializable {
private static final long serialVersionUID = -8851896330953573877L;
private final String world;
// Do you definitely want Double here rather than double?
private final Double x;
private final Double y;
private final Double z;
public Place(String world, Double x, Double y, Double z) {
this.world = world;
this.x = x;
this.y = y;
this.z = z;
}
@Override public int hashCode() {
int hash = 17;
hash = hash * 31 + (world == null ? 0 : world.hashCode());
hash = hash * 31 + (x == null ? 0 : x.hashCode());
hash = hash * 31 + (y == null ? 0 : y.hashCode());
hash = hash * 31 + (z == null ? 0 : z.hashCode());
return hash;
}
@Override public boolean equals(Object other) {
if (!(other instanceof Place)) {
return false;
}
Place p = (Place) other;
// Consider using Guava's "Objects" class to make this simpler
return equalsHelper(world, p.world) &&
equalsHelper(x, p.x) &&
equalsHelper(y, p.y) &&
equalsHelper(z, p.z);
}
private static boolean equalsHelper(Object a, Object b) {
if (a == b) {
return true;
}
if (a == null || b == null) {
return false;
}
return a.equals(b);
}
// TODO: Add getters?
}
It's worth noting that this will be comparing Double values for equality, which is almost always a bad idea... but you can't really give a tolerance in something like equals. So long as the values are exactly the same when you come to look them up, it should work fine.
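To tie the two problems together, a minimal sketch of saving and reloading such a map, assuming the Place class above and the SLAPI helper from the question (the file name here is just an example):
import java.util.HashMap;
import java.util.Map;

public class PlaceMapDemo {
    public static void main(String[] args) throws Exception {
        Map<Place, Long> blockmap = new HashMap<Place, Long>();
        blockmap.put(new Place("world", 1.0, 2.0, 3.0), System.currentTimeMillis() / 60000);

        // HashMap, Place, String, Double and Long are all Serializable, so this round-trips cleanly
        SLAPI.save(blockmap, "blockmap.bin");

        @SuppressWarnings("unchecked")
        Map<Place, Long> reloaded = (Map<Place, Long>) SLAPI.load("blockmap.bin");

        // Works because Place overrides equals and hashCode
        System.out.println(reloaded.containsKey(new Place("world", 1.0, 2.0, 3.0))); // true
    }
}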

Is there a Java data structure that is effectively an ArrayList with double indicies and built-in interpolation?

I am looking for a pre-built Java data structure with the following characteristics:
It should look something like an ArrayList but should allow indexing via double-precision values rather than integers. Note that this means you'll likely see indices that don't line up with the original data points (i.e., asking for the value that corresponds to key "1.5"). EDIT: For clarity, based on the comments, I'm not looking to change the ArrayList implementation. I'm looking for a similar interface and developer experience.
As a consequence, the value returned will likely be interpolated. For example, if the key is 1.5, the value returned could be the average of the value at key 1.0 and the value at key 2.0.
The keys will be sorted but the values are not ensured to be monotonically increasing. In fact, there's no assurance that the first derivative of the values will be continuous (making it a poor fit for certain types of splines).
Freely available code only, please.
For clarity, I know how to write such a thing. In fact, we already have an implementation of this and some related data structures in legacy code that I want to replace due to some performance and coding issues.
What I'm trying to avoid is spending a lot of time rolling my own solution when there might already be such a thing in the JDK, Apache Commons or another standard library. Frankly, that's exactly the approach that got this legacy code into the situation that it's in right now....
Is there such a thing out there in a freely available library?
Allowing double values as indices is a pretty large change from what ArrayList does.
The reason for this is that an array or list with double as indices would almost by definition be a sparse array, which means it has no value (or depending on your definition: a fixed, known value) for almost all possible indices and only a finite number of indices have an explicit value set.
There is no prebuilt class in Java SE that supports all that.
Personally I'd implement such a data structure as a skip-list (or similar fast-searching data structure) of (index, value) tuples with appropriate interpolation.
Edit: Actually there's a pretty good match for the back-end storage (i.e. everything except for the interpolation): Simply use a NavigableMap such as a TreeMap to store the mapping from index to value.
With that you can easily use ceilingEntry() and (if necessary) higherEntry() to get the closest value(s) to the index you need and then interpolate from those.
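A minimal sketch of that idea with linear interpolation (class and method names here are illustrative, not from any library):
import java.util.Map;
import java.util.TreeMap;

public class TreeMapInterpolator {
    private final TreeMap<Double, Double> points = new TreeMap<Double, Double>();

    public void put(double key, double value) {
        points.put(key, value);
    }

    // Linearly interpolate between the nearest lower and higher keys
    public double get(double key) {
        Map.Entry<Double, Double> lo = points.floorEntry(key);
        Map.Entry<Double, Double> hi = points.ceilingEntry(key);
        if (lo == null || hi == null) {
            throw new IllegalArgumentException("key outside interpolation range: " + key);
        }
        if (lo.getKey().doubleValue() == hi.getKey().doubleValue()) {
            return lo.getValue(); // exact hit
        }
        double rate = (hi.getValue() - lo.getValue()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + (key - lo.getKey()) * rate;
    }
}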
If your current implementation has complexity O(log N) for interpolating a value, the implementation I just made up may be for you:
package so2675929;
import java.util.Arrays;
public abstract class AbstractInterpolator {
private double[] keys;
private double[] values;
private int size;
public AbstractInterpolator(int initialCapacity) {
keys = new double[initialCapacity];
values = new double[initialCapacity];
}
public final void put(double key, double value) {
int index = indexOf(key);
if (index >= 0) {
values[index] = value;
} else {
if (size == keys.length) {
keys = Arrays.copyOf(keys, size + 32);
values = Arrays.copyOf(values, size + 32);
}
int insertionPoint = insertionPointFromIndex(index);
System.arraycopy(keys, insertionPoint, keys, insertionPoint + 1, size - insertionPoint);
System.arraycopy(values, insertionPoint, values, insertionPoint + 1, size - insertionPoint);
keys[insertionPoint] = key;
values[insertionPoint] = value;
size++;
}
}
public final boolean containsKey(double key) {
int index = indexOf(key);
return index >= 0;
}
protected final int indexOf(double key) {
return Arrays.binarySearch(keys, 0, size, key);
}
public final int size() {
return size;
}
protected void ensureValidIndex(int index) {
if (!(0 <= index && index < size))
throw new IndexOutOfBoundsException("index=" + index + ", size=" + size);
}
protected final double getKeyAt(int index) {
ensureValidIndex(index);
return keys[index];
}
protected final double getValueAt(int index) {
ensureValidIndex(index);
return values[index];
}
public abstract double get(double key);
protected static int insertionPointFromIndex(int index) {
return -(1 + index);
}
}
The concrete interpolators will only have to implement the get(double) function.
For example:
package so2675929;
public class LinearInterpolator extends AbstractInterpolator {
public LinearInterpolator(int initialCapacity) {
super(initialCapacity);
}
@Override
public double get(double key) {
final double minKey = getKeyAt(0);
final double maxKey = getKeyAt(size() - 1);
if (!(minKey <= key && key <= maxKey))
throw new IndexOutOfBoundsException("key=" + key + ", min=" + minKey + ", max=" + maxKey);
int index = indexOf(key);
if (index >= 0)
return getValueAt(index);
index = insertionPointFromIndex(index);
double lowerKey = getKeyAt(index - 1);
double lowerValue = getValueAt(index - 1);
double higherKey = getKeyAt(index);
double higherValue = getValueAt(index);
double rate = (higherValue - lowerValue) / (higherKey - lowerKey);
return lowerValue + (key - lowerKey) * rate;
}
}
And, finally, a unit test:
package so2675929;
import static org.junit.Assert.*;
import org.junit.Test;
public class LinearInterpolatorTest {
@Test
public void simple() {
LinearInterpolator interp = new LinearInterpolator(2);
interp.put(0.0, 0.0);
interp.put(1.0, 1.0);
assertEquals(0.0, interp.getValueAt(0), 0.0);
assertEquals(1.0, interp.getValueAt(1), 0.0);
assertEquals(0.0, interp.get(0.0), 0.0);
assertEquals(0.1, interp.get(0.1), 0.0);
assertEquals(0.5, interp.get(0.5), 0.0);
assertEquals(0.9, interp.get(0.9), 0.0);
assertEquals(1.0, interp.get(1.0), 0.0);
interp.put(0.5, 0.0);
assertEquals(0.0, interp.getValueAt(0), 0.0);
assertEquals(0.0, interp.getValueAt(1), 0.0);
assertEquals(1.0, interp.getValueAt(2), 0.0);
assertEquals(0.0, interp.get(0.0), 0.0);
assertEquals(0.0, interp.get(0.1), 0.0);
assertEquals(0.0, interp.get(0.5), 0.0);
assertEquals(0.75, interp.get(0.875), 0.0);
assertEquals(1.0, interp.get(1.0), 0.0);
}
@Test
public void largeKeys() {
LinearInterpolator interp = new LinearInterpolator(10);
interp.put(100.0, 30.0);
interp.put(200.0, 40.0);
assertEquals(30.0, interp.get(100.0), 0.0);
assertEquals(35.0, interp.get(150.0), 0.0);
assertEquals(40.0, interp.get(200.0), 0.0);
try {
interp.get(99.0);
fail();
} catch (IndexOutOfBoundsException e) {
assertEquals("key=99.0, min=100.0, max=200.0", e.getMessage());
}
try {
interp.get(201.0);
fail();
} catch (IndexOutOfBoundsException e) {
assertEquals("key=201.0, min=100.0, max=200.0", e.getMessage());
}
}
private static final int N = 10 * 1000 * 1000;
private double measure(int size) {
LinearInterpolator interp = new LinearInterpolator(size);
for (int i = 0; i < size; i++)
interp.put(i, i);
double max = interp.size() - 1;
double sum = 0.0;
for (int i = 0; i < N; i++)
sum += interp.get(max * i / N);
return sum;
}
@Test
public void speed10() {
assertTrue(measure(10) > 0.0);
}
@Test
public void speed10000() {
assertTrue(measure(10000) > 0.0);
}
@Test
public void speed1000000() {
assertTrue(measure(1000000) > 0.0);
}
}
So the functionality seems to work. I only measured speed in some simple cases, and these suggest that scaling will be better than linear.
Update (2010-10-17T23:45+0200): I made some stupid mistakes in checking the key argument in the LinearInterpolator, and my unit tests didn't catch them. Now I extended the tests and fixed the code accordingly.
In the Apache commons-math library, if you use an implementation of UnivariateRealInterpolator, the return value of its interpolate method (which is typed UnivariateRealFunction) will get you most of the way there.
The interpolator interface takes two arrays, x[] and y[]. The returned function has a method, value() that takes an x' and returns the interpolated y'.
Where it fails to provide an ArrayList-like experience is in the ability to add more values to the range and domain as if the List is growing.
Additionally, they look to be in need of some additional interpolation functions. There are only 4 implementations in the library for the stable release. As a commenter pointed out, it seems to be missing 'linear' or something even simpler like nearest neighbor. Maybe that's not really interpolation...
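A rough sketch against the commons-math 2.x API described above (package and class names from that release; treat the exact exception signatures as an approximation):
import org.apache.commons.math.analysis.UnivariateRealFunction;
import org.apache.commons.math.analysis.interpolation.SplineInterpolator;
import org.apache.commons.math.analysis.interpolation.UnivariateRealInterpolator;

public class CommonsMathInterpolationDemo {
    public static void main(String[] args) throws Exception {
        double[] x = { 0.0, 1.0, 2.0, 3.0 };
        double[] y = { 0.0, 10.0, 5.0, 20.0 };

        UnivariateRealInterpolator interpolator = new SplineInterpolator();
        UnivariateRealFunction function = interpolator.interpolate(x, y);

        // Ask for a value between the original sample points
        System.out.println(function.value(1.5));
    }
}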
That's a huge change from ArrayList.
Same as Joachim's response above, but I'd probably implement this as a binary tree, and when I didn't find the exact key I was looking for, average the values of the next smallest and next largest keys, which should be quick to traverse to.
Your description that it should be "like an ArrayList" is misleading, since what you've described is a one dimensional interpolator and has essentially nothing in common with an ArrayList. This is why you're getting suggestions for other data structures which IMO are sending you down the wrong path.
I don't know of any available in Java (and couldn't easily find one on Google), but I think you should have a look at GSL, the GNU Scientific Library, which includes a spline interpolator. It may be a bit heavy for what you're looking for since it's a two-dimensional interpolator, but it seems like you should be looking for something like this rather than something like an ArrayList.
If you'd like it to "look like an ArrayList" you can always wrap it in a Java class which has access methods similar to the List interface. You won't be able to actually implement the interface though, since the methods are declared to take integer indices.
