I am using Apache Ignite, which isn't really central to the question but gives background. In that context, I've created a class extending CacheStoreAdapter that has a method with the following signature:
@Override
public void write(Entry<? extends K, ? extends V> cacheEntry) throws CacheWriterException {
I registered that class with Ignite, so that it calls the write() method whenever I give it data to save in its cache.
What I was surprised to find is that, depending on how Ignite is otherwise configured, the following code...
final V cacheObject = cacheEntry.getValue();
LOG.info("cacheObject = " + ToStringBuilder.reflectionToString(cacheObject));
... outputs the following:
cacheObject = org.apache.ignite.internal.binary.BinaryObjectImpl@7c40ffef[ctx=org.apac
That is, the cacheObject taken from an Entry<? extends K, ? extends V> is not an instance of type V!
I've worked around the issue (as I said it only happens depending on how Ignite is otherwise configured), but I am curious how this is even done in Java.
TL;DR Question:
How is it possible to pass a variable to a method when the variable does not conform to the method's signature? Is it some kind of reflection technique? Is there a common / legitimate use for doing this?
In Java, type parameters are optional: they are not carried along with an object instance and only exist at the language level (type erasure).
So you can always cast anything to any generic type (going through the raw type) and then call its methods with the type checks effectively erased:
Map<String, Integer> sim = new HashMap<>();
Map<Object, Object> oom = (Map<Object, Object>) (Map) sim;
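To make the consequence concrete, here is a minimal, hedged sketch (my own illustration, not from the question) of how such an erased cast lets a value of the "wrong" type reach a generically typed variable; the ClassCastException only surfaces when the value is actually used:
import java.util.HashMap;
import java.util.Map;

public class ErasureDemo {
    public static void main(String[] args) {
        Map<String, Integer> sim = new HashMap<>();
        // the double cast compiles (with unchecked warnings) because generic types are erased at runtime
        Map<Object, Object> oom = (Map<Object, Object>) (Map) sim;
        oom.put("key", "not an Integer"); // heap pollution: a String slips in where an Integer is expected
        Integer i = sim.get("key");       // the ClassCastException surfaces here, not at the put
    }
}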
As for BinaryObjectImpl, Ignite will try to keep objects in serialized (binary) form where possible to save on serialization costs. So you should be aware that the type parameters of CacheStore do not always correspond to the user-facing types.
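If you do need the user type inside the store, one defensive option is to deserialize explicitly when a binary object shows up. This is only a sketch based on Ignite's public org.apache.ignite.binary.BinaryObject API, not on the question's actual configuration, and it would live inside the question's CacheStoreAdapter subclass:
@Override
public void write(Entry<? extends K, ? extends V> cacheEntry) throws CacheWriterException {
    Object raw = cacheEntry.getValue();
    // depending on configuration, the value may still arrive in binary (serialized) form
    Object value = (raw instanceof BinaryObject) ? ((BinaryObject) raw).deserialize() : raw;
    // ... persist 'value' to the underlying store ...
}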
It is also possible that the caller of your write implementation creates an instance of the raw type Entry. For example:
Entry entry = new Entry() { /* ... */ };
...
cache.write(entry);
Related
I'm using generics to make my code reusable and to utilize dependency injection.
I have two interfaces: DataParserImplementation and ObjectImplementation. I have classes that implement each: SalesRepbyId implements DataParserImplementation (it parses the data into objects and puts those objects into collections), and SalesRep implements ObjectImplementation (it is the object for a specific dataset).
I'm trying to make it so that I can select which kind of ObjectImplementation I use in my SalesRepbyId class, in order to remove the coupling.
I know there is something called reflection that I've been told is the technique I need to use. I have also heard about a "Factory pattern" and a "properties file" that allow me to do what I want. A lot of this is very confusing, so please explain it like I'm five.
Here is the code, with the point where it stops working:
EDIT: Revisions based on comments: I want to specify the type of DataObject (D) my class uses by passing it through the constructor via a common interface and using generic types. When I try to use it instead of a concrete implementing class, I get the error. I can't find anything about this error.
public class SalesRepbyId<D extends ObjectImplementation> implements DataParserImplementation<Map<String,D>> {
private FileParserImplementation<ArrayList<String[]>> FileParser;
private D dataObject;
public SalesRepbyId(FileParserImplementation<ArrayList<String[]>> FileParser,D d){
this.FileParser = FileParser;
this.dataObject = d;
}
@Override
public Map<String, D> Parse() {
try{
//reads the file and returns an array of string arrays
ArrayList<String[]> Salesrep_contactlist = FileParser.ReadFile();
//here it still says "Unknown Class." that's the problem
Map<String, dataObject> SalesrepByIdMap = new HashMap<>();
//I want to be able to put in any class that implements
//dataObject into this class and have it run the same way.
Summary of what I did
I implemented the Factory design pattern and created a properties file, which allowed me to reflectively load the class I wanted instead of trying to use a generic DataObject (or D) type.
Details of Solution
Loading the class reflectively using the properties file "config.properties" and then casting it to the type ObjectImplementation allowed me to use any class that implements that interface (and is handled by the Factory and set in the properties file). I then refactored all instances of D to the type ObjectImplementation, since the parent interface is the layer of abstraction needed here rather than a generic concrete class.
Why it didn't work the way I tried it in the question
The reason the generic D type doesn't work with reflection is that reflection works with a concrete class type determined at runtime, while the generic D type only exists at compile time and is erased afterwards. I was effectively trying to use the class type and its methods/instances without properly using reflection, and the compiler was telling me that the class type was unknown at the point where I needed it.
Code example to compare to the Question code
Example of the working code:
public class SalesRepbyId implements
DataParserImplementation<Map<String,ObjectImplementation>> {
private FileParserImplementation<ArrayList<String[]>> FileParser;
//the Factory class that creates instances of the reflected class I wanted
private ObjectFactory Factory = new ObjectFactory();
public Map<String, ObjectImplementation> Parse() {
//the Properties object, which loads properties from a file and names the class type I want to load reflectively
Properties prop = new Properties();
//loading in the classtype and casting it to the subclass of ObjectImplementation that it actually is
prop.load(SalesRepbyId.class.getResourceAsStream("config.properties"));
Class<? extends ObjectImplementation> Classtouse = Class.forName(prop.getProperty("ObjectImplementation")).asSubclass(ObjectImplementation.class);
//construct instances of 'Classtouse' and parse the data into these dynamically typed objects
//return the map that holds these objects
}
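For illustration, a minimal sketch of what the factory step might look like; this is my own hypothetical version, not the original ObjectFactory, and it assumes the configured class has a public no-argument constructor:
//hypothetical factory: given the Class loaded from config.properties, create instances reflectively
public class ObjectFactory {
    public ObjectImplementation create(Class<? extends ObjectImplementation> classToUse) throws ReflectiveOperationException {
        //assumes the configured class has a public no-argument constructor
        return classToUse.getDeclaredConstructor().newInstance();
    }
}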
I get the below exception from my Java code.
java.lang.ClassCastException: scala.collection.immutable.Map$Map1 cannot be cast to java.util.HashMap
at au.com.vroc.udf.medianUDF.update(medianUDF.java:79)
I am getting this error in my Spark application when I cast the buffer to a java.util.HashMap. This is my code:
public void update(MutableAggregationBuffer buffer, Row input) {
if (!input.isNullAt(0)) {
HashMap currentBuffer=(HashMap) buffer.get(0);//getting exception here
//HashMap currentBuffer=new HashMap();
currentBuffer.put(input.getLong(0), input.getDouble(0));
//currentBuffer.add(currentMap);
buffer.update(0, currentBuffer);
}
}
I guess that instead of a Java HashMap I have to use "scala.collection.immutable.Map$Map1" inside my Java class. Can I use any tool from the "JavaConversions" namespace?
Any help would be appreciated!
The simplest approach would likely be to use Scala's JavaConverters.
It should look something like this (not tested, but type-checks):
import scala.collection.JavaConverters;
// buffer.get(0) returns an Object, so it has to be cast to a scala.collection.Map before converting
java.util.Map<Object, Object> currentBuffer =
        JavaConverters.mapAsJavaMapConverter((scala.collection.Map<Object, Object>) buffer.get(0)).asJava();
Please note that it returns a type-parameterized map (i.e. java.util.Map<K, V>), not the non-parameterized java.util.HashMap in your example - you might want to alter the rest of your code to work on the parameterized maps for better type safety.
To get a java.util.Map you should use the getJavaMap method:
java.util.Map<T, U> currentBuffer = (java.util.Map<T, U>) buffer.getJavaMap(0);
Note that this is not a HashMap - the stored value is encoded on update and decoded on get. To modify it, you have to make a copy.
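Putting that together, here is a hedged sketch of what the update method might look like; this is my own illustration, assuming the key sits in column 0 and the value in column 1 of the input Row, and that the buffer holds a Long-to-Double map:
@Override
public void update(MutableAggregationBuffer buffer, Row input) {
    if (!input.isNullAt(0)) {
        //read the buffered map through the Row API, then copy it so it can be modified
        java.util.Map<Long, Double> buffered = buffer.getJavaMap(0);
        java.util.HashMap<Long, Double> copy = new java.util.HashMap<>(buffered);
        copy.put(input.getLong(0), input.getDouble(1)); //assumed column layout: key at 0, value at 1
        buffer.update(0, copy);
    }
}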
I have the following stream to select objects that match a certain criteria:
protected final Map<String, PropertyMapping> propertyMappings = new LinkedHashMap<>();
public List<PropertyMapping> getPropertyMappingsByAnnotation(final Class annotation) {
return propertyMappings.values()
.stream()
.filter(pm -> pm.getAnnotation(annotation) != null)
.collect(Collectors.toList());
}
The filter somehow causes the Stream to lose track of the generic type of the stream which causes the collect statement to fail with the following error:
incompatible types: java.lang.Object cannot be converted to java.util.List<PropertyMapping>
If I change the filter to pm -> true, for example, the stream works again. What causes this behavior, and is there a way to avoid it? It probably has something to do with the 'annotation' class that gets passed in; I tried adding a final modifier to see if that would fix the problem.
This is the signature of the getAnnotation method:
public final <T extends Annotation> T getAnnotation(Class<T> annotationClass)
The obvious issue I can see is that you are passing a raw Class variable as an argument to a method that expects a Class<T> where T extends Annotation. I imagine the compiler is failing to fully resolve that method call, and that is what causes the compilation error at the end of the stream chain. If you fix that, your mystery problem might go away.
Something like this:
public <T extends Annotation> List<PropertyMapping>
getPropertyMappingsByAnnotation(Class<T> annotation) {
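In full, a hedged sketch of the method with that signature change applied (my rewrite of the question's code, untested against the actual PropertyMapping class):
public <T extends Annotation> List<PropertyMapping> getPropertyMappingsByAnnotation(final Class<T> annotation) {
    //with a parameterized Class<T>, the getAnnotation call is no longer unchecked, and per the
    //reasoning above the stream should keep its Stream<PropertyMapping> element type up to collect()
    return propertyMappings.values()
            .stream()
            .filter(pm -> pm.getAnnotation(annotation) != null)
            .collect(Collectors.toList());
}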
Hmm, does collect here know what kind of list it needs to produce (ArrayList/LinkedList)? Try something like this:
List<PropertyMapping> result = propertyMappings.values().stream()
.filter(pm -> pm.getAnnotation(annotation) != null)
.collect(Collectors.toCollection(LinkedList::new));
If this doesn't work, then maybe try adding a wildcard to the Class argument, or even a type parameter.
IntelliJ IDEA 2016
Is there a way to figure out what generic type has been deduced in IntelliJ? I'm working with a rather large stream pipeline, and I sometimes come across misleading error messages such as "Non-static method cannot be referenced from a static context" when using a grouping collector like this:
public interface MyInterface<T extends Number> {
    String str();
}
public final class MyInterfaceUtils {
    public static <T extends Number> Collection<MyInterface<T>> myInterfaces(MyInterface<T> mi) {
        //impl goes here
    }
}
Now the pipeline itself:
Collection<MyInterface<Integer>> col;
//...
Map<String, Collection<MyInterface<Integer>>> m = col.stream()
.flatMap(mi -> MyInterfaceUtils.myInterfaces(mi).stream())
.collect(groupingBy(MyInterface::str)); //prints error:
//Non-static method cannot be referenced from a static context
But when I replace the declaration with Map<String, List<MyInterface<Integer>>> m it works fine. I understand that generic types are not inherited in this way, but the error message was kind of misleading, so I had to spend some time figuring out what was going on.
Maybe there's a way to cope with such error messages in IntelliJ?
The error message is misleading. What is actually wrong is that groupingBy returns Map<String, List<...>>, not Map<String, Collection<...>>, so changing Collection to List in the declaration solves the compilation problem.
Regarding how to find out the type of a stream pipeline operation: usually I put the cursor on the stream pipeline function and use Extract Variable, which gives me a variable declared with the inferred type.
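For reference, here is the corrected pipeline from the question (my own restatement; it assumes the same static import of groupingBy used in the original snippet):
//groupingBy produces Map<String, List<...>>, so the target variable must be declared with List, not Collection
Map<String, List<MyInterface<Integer>>> m = col.stream()
        .flatMap(mi -> MyInterfaceUtils.myInterfaces(mi).stream())
        .collect(groupingBy(MyInterface::str));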
I'm having problems with an MBean that takes a Map<String, Object> as a parameter. If I try to execute it via JMX using a proxy object, I get an Exception:
Caused by: javax.management.ReflectionException
at org.jboss.mx.server.AbstractMBeanInvoker.invoke(AbstractMBeanInvoker.java:231)
at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:668)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
Caused by: java.lang.IllegalArgumentException: Unable to find operation updateProperties(java.util.HashMap)
It appears that it attempts to use the actual implementation class rather than the interface, and doesn't check whether it is a subtype of the required interface. The same thing happens with subclasses of the declared type (for example, declare HashMap, pass in a LinkedHashMap). Does this mean it's impossible to use an interface in the signature of such methods? At the moment I'm getting around it by changing the method signature to accept a HashMap, but it seems odd that I wouldn't be able to use interfaces (or base classes) in my MBeans.
Edit: The proxy object is being created by an in-house utility class called JmxInvocationHandler. The (hopefully) relevant parts of it are as follows:
public class JmxInvocationHandler implements InvocationHandler
{
...
public static <T> T createMBean(final Class<T> iface, SFSTestProperties properties, String mbean, int shHostID)
{
T newProxyInstance = (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class[] { iface }, (InvocationHandler) new JmxInvocationHandler(properties, mbean, shHostID));
return newProxyInstance;
}
...
private JmxInvocationHandler(SFSTestProperties properties, String mbean, int shHostID)
{
this.mbeanName = mbean + MBEAN_SUFFIX + shHostID;
msConfig = new MsConfiguration(properties.getHost(0), properties.getMSAdminPort(), properties.getMSUser(), properties.getMSPassword());
}
...
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable
{
if (management == null)
{
management = ManagementClientStore.getInstance().getManagementClient(msConfig.getHost(),
msConfig.getAdminPort(), msConfig.getUser(), msConfig.getPassword(), false);
}
final Object result = management.methodCall(mbeanName, method.getName(), args == null? new Object[] {} : args);
return result;
}
}
Got it. JMX invocations sometimes make cannon fodder of the best-intentioned utility classes... :)
This guy, I suspect, is a problem:
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable
{
if (management == null)
{
management = ManagementClientStore.getInstance().getManagementClient(msConfig.getHost(),
msConfig.getAdminPort(), msConfig.getUser(), msConfig.getPassword(), false);
}
final Object result = management.methodCall(mbeanName, method.getName(), args == null? new Object[] {} : args);
return result;
}
because the MBean's operation signature (which cares not a whit about inheritance) is determined from the classes of the passed arguments. Since you cannot pass an actual concrete object for which getClass() will return java.util.Map, you will never make a match using the direct types of the arguments themselves. (Similar problems occur with primitives for the same reason).
See this blog post, starting with the paragraph that opens with "One of the tricky parts of making MetaMBean", as it explains this problem (or the problem I think you're having) in a bit more detail. The invoke method of the MBeanServer[Connection] is:
invoke(ObjectName name, String operationName, Object[] params, String[] signature)
The first two and the last arguments are navigational, in that they specify exactly which operation, amongst all the ops published in the server, should be invoked. The best way to sidestep this issue is to avoid having to "guess" the signature and to rely only on the ObjectName and the operation name, which in turn can be done by interrogating (and possibly caching) the MBeanInfo and MBeanOperationInfos of the target MBean. The MBeanOperationInfos will provide you with the signature so you don't have to guess.
If this is indeed your issue, there are a couple of ways you can address it:
1. If the MBean's operation names are unique (i.e. there is no overloading), you can simply use the operation name to look up the signature in the MBeanInfo.
2. If the MBean's operations are overloaded (i.e. there are multiple operations with the same name but different parameters) but they all have different parameter counts, you can easily determine the correct signature by iterating over the matching operation names in the MBeanOperationInfos and matching by parameter count. (A sketch of this lookup follows the list.)
3. If #1 and #2 do not apply, then it's tricky and I would re-evaluate the method signatures of your MBean's code.
4. If #1 and #2 do not apply and #3 will not comply, take a look at the class in Gmx called MetaMBean. In the latest revision, it uses Groovy to create a compiled runtime interface from the MBean's MBeanInfo to make inheritance (and autoboxing) work in method invocation. The same approach could be implemented in JavaScript (which has the virtue of being built into Java 6+) or several other JVM scripting languages. Alternatively, look at the previous version, which attempted to pattern-match against known operation signatures (and actually worked pretty well, but since I was working with Groovy anyway...).
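As a hedged illustration of #1/#2, here is my own sketch, not part of the original JmxInvocationHandler; it assumes access to an MBeanServerConnection and that the operation name appears only once on the target MBean:
import javax.management.MBeanInfo;
import javax.management.MBeanOperationInfo;
import javax.management.MBeanParameterInfo;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;

//hypothetical helper: resolve the operation's published signature from MBeanOperationInfo
//instead of guessing it from the runtime classes of the arguments
public static Object invokeByName(MBeanServerConnection conn, ObjectName name,
                                  String opName, Object[] args) throws Exception
{
    MBeanInfo info = conn.getMBeanInfo(name);
    for (MBeanOperationInfo op : info.getOperations())
    {
        if (op.getName().equals(opName))
        {
            MBeanParameterInfo[] params = op.getSignature();
            String[] signature = new String[params.length];
            for (int i = 0; i < params.length; i++)
            {
                signature[i] = params[i].getType(); // e.g. "java.util.Map", not "java.util.HashMap"
            }
            return conn.invoke(name, opName, args == null ? new Object[0] : args, signature);
        }
    }
    throw new IllegalArgumentException("No operation named " + opName + " found on " + name);
}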
I hope this is helpful. If this turns out not to be the root cause, then forget I said anything....