The following issue was encountered while upgrading Spring 3.2 -> 4.1
There is a Metadata hierarchy, like: AMetadata extends Metadata,
BMetadata extends Metadata etc.
There is a Processor hierarchy, like:
abstract Processor<M extends Metadata>,
AProcessor extends Processor<AMetadata>,
BProcessor extends Processor<BMetadata> etc
There is a service containing an injected List of processors, like this:
@Inject
private List<Processor<Metadata>> processors;
While this worked perfectly in Spring 3.2, with Spring 4.1.0 (and 4.0 as well) it fails to inject list members. Going into debug, it was discovered that:
Processor<Metadata>.isAssignableFrom(BProcessor) == false and this causes Processor beans not to be matched as eligible candidates for injection.
A possible hack-looking solution is to declare Processors as follows:
BProcessor<Metadata> extends Processor<BMetadata> - that works, but looks a bit weird. Another option is to use List<Processor<? extends Metadata>>, but this requires some code changes elsewhere to be compilable and causes a lot of type-safety-check warnings in classes which relied on generics.
So the question is: how do I handle this case properly? Has anyone encountered something similar?
Autowiring based on generics was one of the new features in Spring 4; AFAIK generic types were ignored in previous versions. More info here: https://spring.io/blog/2013/12/03/spring-framework-4-0-and-java-generics
So I can't think of any other solution than the one you already pointed out: List<Processor<? extends Metadata>>.
Also see the video Spring Framework on Java 8 (https://www.youtube.com/watch?v=-_aWK8T_YMI), which explains this. Generic type information was ignored until the Spring 4.0 release; now it is taken into account.
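A minimal sketch (reusing the class names from the question; the method bodies are mine) of why the wildcard declaration is the one that compiles: a Processor<BMetadata> is not a Processor<Metadata>, but it is a Processor<? extends Metadata>, so only the wildcard list can hold all the concrete processors.

```java
import java.util.ArrayList;
import java.util.List;

public class WildcardDemo {

    static class Metadata {}
    static class AMetadata extends Metadata {}
    static class BMetadata extends Metadata {}

    static abstract class Processor<M extends Metadata> {
        abstract void process(M metadata);
    }

    static class AProcessor extends Processor<AMetadata> {
        void process(AMetadata metadata) {}
    }

    static class BProcessor extends Processor<BMetadata> {
        void process(BMetadata metadata) {}
    }

    static List<Processor<? extends Metadata>> buildProcessors() {
        // Compiles: each concrete processor is a Processor<? extends Metadata>
        List<Processor<? extends Metadata>> processors = new ArrayList<>();
        processors.add(new AProcessor());
        processors.add(new BProcessor());

        // Would NOT compile: Processor<BMetadata> is not a Processor<Metadata>
        // List<Processor<Metadata>> strict = new ArrayList<>();
        // strict.add(new BProcessor());

        return processors;
    }
}
```

This mirrors what Spring 4's generics-aware autowiring checks at injection time, which is why the wildcard declaration is the type-correct fix despite the extra warnings it causes downstream.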
Related
I'd like to learn whether there are rules / conditions under which a Spring component gets wrapped (proxied) by CGLIB. For example, take this case:
@Component
public class TestComponent {
}

@Service
//@Transactional(rollbackFor = Throwable.class)
public class ProcessComponent {

    @Autowired
    private TestComponent testComponent;

    public void doSomething(int key) {
        // try to debug "testComponent" instance here ...
    }
}
If we leave it like this and debug the testComponent field inside the method, we'll see that it's not wrapped by CGLIB.
Now if we uncomment the @Transactional annotation and debug again, we'll find that the instance is wrapped: it's of type ProcessComponent$$EnhancerByCGLIB$$14456 or something like that. That's clearly because Spring needs to create a proxy class to handle the transaction support.
But I'm wondering: is there any way to detect how and when this wrapping happens? For example, some specific locations in Spring's source code to debug into for more information, or some documentation on the rules for deciding when to create a proxy?
For your information, I need to know this because I'm facing a situation where some component (not @Transactional; the example above is just for demonstration purposes) in my application suddenly becomes proxied (I found a revision a bit in the past where it was not). The most important issue is that this affects components that also contain public final methods, and another issue (also important) is that there must have been some unexpected change in the design / structure of the classes. For these kinds of issues we must of course try to find out what happened / which change led to this, etc.
One note: we have just upgraded our application from Spring Boot 2.1.0.RELEASE to 2.1.10.RELEASE, and checking the code revision by revision up till now is not feasible, because there have been quite a lot of commits.
Any kind of help would be appreciated, thanks in advance.
You could debug into org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.getAdvicesAndAdvisorsForBean(Class, String, TargetSource).
If any advisor is found, the bean will be proxied.
If you use @Lookup method injection it will also proxy the component class.
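As a quick runtime check (a heuristic sketch, not Spring's official API): CGLIB-generated subclasses carry "$$" in their class name, which is essentially what Spring's own ClassUtils checks for. Inside a running Spring application the proper checks are AopUtils.isAopProxy(bean) and AopUtils.isCglibProxy(bean); the helper name below is mine.

```java
public class ProxyCheck {

    // Heuristic: CGLIB-enhanced classes have names like
    // "ProcessComponent$$EnhancerByCGLIB$$14456" (or "$$SpringCGLIB$$"
    // in newer Spring versions), so "$$" in the class name is a strong hint.
    static boolean looksLikeCglibProxy(Object bean) {
        return bean != null && bean.getClass().getName().contains("$$");
    }
}
```

A breakpoint in getAdvicesAndAdvisorsForBean (as suggested above) tells you why a bean is proxied; a check like this only tells you that it is.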
Kotlin 1.2.40
Spring Integration 5.0.4
(was working with Kotlin 1.2.30 + Spring Integration 4.*)
Trying to create an integration flow config (several ways):
...
.route { m: MyClass -> m.key }
...
.route(MyClass::class.java, { m: MyClass -> m.key })
...
// even
.route<MyClass, String>(object : java.util.function.Function<MyClass, String> ...)
All these notations compile but fail at runtime because Spring Integration cannot find a method to use as the route.
If the config is rewritten in Java, everything works.
One thing is that Kotlin's lambdas are not of a synthetic class while Java's are, so Spring Integration doesn't see them as lambdas.
On the other hand, it worked before, and some DSL operations such as transform and some others still work with Kotlin lambdas.
It can be avoided with an explicit method name:
.route({ m: MyClass -> m.key }, "apply")
but it's very ugly.
Does anyone know a solution other than "apply" or moving the @Configuration with Spring Integration to Java?
Working example on GitHub: https://github.com/comdiv/kotlin-spring-issue/blob/master/src/test/kotlin/comdiv/TryCreateIntegrationFlowTest.kt
I think it works with transform() because GenericTransformer has only one method to select. It worked previously because the Function interface was copied from Java 8 into the Java 7 code base, with only one apply() method.
So far I see you have a workaround. As a fix I can suggest modifying all the Function-based Java DSL EIP methods to specify the apply method explicitly so they don't fall into ambiguity.
Although I think this is really a Kotlin issue, in that it doesn't honor Java...
I have a column in PostgreSQL 9.6 of type "character varying[]" which is essentially an array of strings. I am using Dropwizard with Hibernate to handle the database connection. Normally I just need to provide an annotation to define the data type, however, Hibernate is complaining about the deserialization of a varchar[] type. How do I map this to a List in Java?
I have tried implementing my own UserType extension class to handle the (de)serialization, with no luck. Any help would be most appreciated.
UPDATE:
I took a look at the link posted by @Waqas and I was able to at least create a type that extends UserType to implement the mapping of varchar[] to String[] in Java.
Some differences in my implementation:
In the nullSafeSet() and nullSafeGet() methods that need to be implemented (@Override), I had to use a (newer?) class called SharedSessionContractImplementor from org.hibernate.engine.spi instead of the (older?) class SessionImplementor.
When I implemented this change and added the @Type annotation to my column mapping (in my entity data class), the runtime complained about an @Override that apparently wasn't valid for a certain HibernateBundle class (error below), even though Maven built the jar without any issues and I only have Java 1.8 installed on my machine (openSUSE). P.S. I am using Dropwizard 1.2 and I took the declaration of the HibernateBundle straight from their documentation. Nevertheless, I deleted the @Override annotation and it works now. Not sure why, or how, but it does.
Error as promised:
INFO [2017-11-08 22:39:06,220] org.eclipse.jetty.util.log: Logging initialized @1137ms to org.eclipse.jetty.util.log.Slf4jLog
INFO [2017-11-08 22:39:06,310] io.dropwizard.server.DefaultServerFactory: Registering jersey handler with root path prefix: /
INFO [2017-11-08 22:39:06,312] io.dropwizard.server.DefaultServerFactory: Registering admin handler with root path prefix: /
java.lang.Error: Unresolved compilation problem:
The method getDataSourceFactory(ApplicationConfiguration) of type new HibernateBundle<ApplicationConfiguration>(){} must override a superclass method
at com.tksoft.food.Application$1.getDataSourceFactory(Application.java:24)
at com.tksoft.food.Application$1.getDataSourceFactory(Application.java:1)
at io.dropwizard.hibernate.HibernateBundle.run(HibernateBundle.java:61)
at io.dropwizard.hibernate.HibernateBundle.run(HibernateBundle.java:15)
at io.dropwizard.setup.Bootstrap.run(Bootstrap.java:200)
at io.dropwizard.cli.EnvironmentCommand.run(EnvironmentCommand.java:42)
at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:85)
at io.dropwizard.cli.Cli.run(Cli.java:75)
at io.dropwizard.Application.run(Application.java:93)
at com.tksoft.food.Application.main(Application.java:30)
Anyway, this has left me super confused, but it is working so I am happy :) (for now). I just have to figure out if I can map it to a List instead of an array :/
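On that final point about mapping to a List: one option (a sketch under the assumption that the custom UserType keeps returning String[] from nullSafeGet(), as described above; the helper names are mine, not from Hibernate or Dropwizard) is to convert at the entity boundary rather than changing the UserType itself:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class VarcharArrayMapping {

    // Convert the String[] produced by the UserType into a mutable List,
    // treating a SQL NULL (null array) as an empty list.
    static List<String> toList(String[] dbValue) {
        if (dbValue == null) {
            return new ArrayList<>();
        }
        List<String> result = new ArrayList<>(dbValue.length);
        Collections.addAll(result, dbValue);
        return result;
    }

    // Convert back to the String[] the UserType expects on the write path.
    static String[] toArray(List<String> value) {
        return value == null ? new String[0] : value.toArray(new String[0]);
    }
}
```

The entity can then expose getters/setters typed as List<String> while the persisted field stays a String[], keeping the UserType untouched.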
Currently we are using Spring Data JPA with a MySQL database and DataTablesRepository, which works well with JPA. Now we are moving our data to Spring Data Elasticsearch, but DataTablesRepository does not work with it. Is there an alternative, or how can I implement a custom repository for this?
spring-data-jpa-datatables does not implement support for ElasticsearchRepository, as you say, and it uses the Specification API, which is not implemented by Spring Data for Elasticsearch, so extending it would take some work.
What you need to do is create your own ElasticsearchRepositoryFactoryBean (i.e. ElasticsearchDataTablesRepositoryFactoryBean) and your own implementation of AbstractElasticsearchRepository that implements the specifics of spring-data-jpa-datatables, just like DataTablesRepositoryImpl. You should also define your own DataTablesRepository (an ElasticsearchDataTablesRepository that extends ElasticsearchRepository) with the same methods.
The org.springframework.data.jpa.datatables.mapping classes can be reused, but you'll have to recreate the logic found in SpecificationFactory for elasticsearch using QueryBuilders, which will be the most time consuming part I imagine.
When you're done, you can use @EnableElasticsearchRepositories just as described by spring-data-jpa-datatables, i.e.:
@EnableElasticsearchRepositories(repositoryFactoryBeanClass = ElasticsearchDataTablesRepositoryFactoryBean.class)
Then extend your repositories with your ElasticsearchDataTablesRepository interface and you're good to go.
For reference you should look at SpecificationFactory and AbstractElasticsearchRepository (the search method) and get familiar with Elasticsearch QueryBuilders.
Background: I asked another question (here: Performance degradation after moving to jersey 2) about jersey performance. I also opened an issue in jersey Jira. Apparently, there is a known performance problem with sub-resources in jersey 2.
My data model is based on entity A, with several sub-entities. However, the IDs of the sub-entities are unique, and the server allows accessing them directly. For example:
/server/As/{a_id}/Bs/{b_id} == /server/Bs/{b_id}
We're using sub-resources for that, so the same resource (B) is both a Spring component and a member of the A resource. All the resources are Spring beans, so both Bs are the same instance.
Now I'm trying to work around this problem by not using sub-resources at all. I found out that @Path does not support multiple paths. Any idea how to solve it?
I tried the following; does it make sense? Can you offer alternatives? I'm asking because I'll have to do the same trick many times (for entities C, D, E etc., and maybe for an additional level of resources).
First, I removed B reference from A resource class, then:
public abstract class AbstractB {

    @GET
    @Path("{b_id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getB(@PathParam("b_id") String bId) {
        ...
    }
    ...
}

@Component
@Path("As/{a_id}/Bs")
public class B extends AbstractB { /* empty */ }

@Component
@Path("Bs")
public class AsB extends AbstractB { /* empty */ }
Edit: (by peeskillet) - Link to issue