Dispatcher-servlet cannot map to websocket requests - java

I'm developing a Java webapp with Spring as the main framework (Spring core, Spring mvc, Spring security, Spring data, Spring websocket are notably used).
Declaring a message-broker in a Spring context like this provides a SimpMessagingTemplate bean to the context:
<websocket:message-broker>
    <websocket:stomp-endpoint path="/stomp">
        <websocket:sockjs/>
    </websocket:stomp-endpoint>
    <websocket:simple-broker prefix="/topic,/queue"/>
</websocket:message-broker>
I have to put this tag in my root context (applicationContext.xml), otherwise services declared in that root context cannot send notifications to users via websocket (because they need the SimpMessagingTemplate).
The thing is, if I put this tag in the root context, clients get a 404 when they subscribe to websocket. And if I put the tag in the dispatcher-servlet, then services in the root context cannot send notifications since they would need the SimpMessagingTemplate (but it is only available in the child dispatcher-servlet context).
Is there a way to "bind" the dispatcher-servlet to the broker? Declaring the bean twice is not a correct solution.
This issue is the same as "Spring : how to expose SimpMessagingTemplate bean to root context ?", but seen from another angle (declaring the websocket in the root context instead of in the dispatcher-servlet).

I found a dirty solution. I don't like it, but given the lack of answers on SO as well as from current and former colleagues, I had to go forward with the project and implemented a dirty fix.
The dirty fix is to autowire the SimpMessagingTemplate into the @Controller and @Scheduled classes (all scanned by the dispatcher-servlet, where the websocket tag is declared), and to pass the SimpMessagingTemplate as a parameter to the service methods (declared in the root context).
This solution is not transparent (the SimpMessagingTemplate should be autowired directly in services ideally) but it definitely fixes the problem.

I've written a bean that performs the injection after the servlet application context has been initialized. It searches up through the parent application contexts and injects the SimpMessagingTemplate.
Whatever bean needs the template:
@Autowired(required = false) // required=false so that startup does not fail when the template is absent
private SimpMessagingTemplate messagingTemplate;
PostInjectSimpMessageTemplateBean:
Place this bean in the servlet application context (i.e. the same XML file where the websocket tag is declared).
(Replace "YOUR.PACKAGE.NAME" with your base package.)
public class PostInjectSimpMessageTemplateBean implements ApplicationListener<ContextRefreshedEvent> {
    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        ApplicationContext servletContext = event.getApplicationContext();
        ApplicationContext context = servletContext.getParent();
        SimpMessagingTemplate template = servletContext.getBean(SimpMessagingTemplate.class);
        // Walk up the chain of parent contexts and inject the template wherever it is expected
        while (context != null) {
            for (String beanName : context.getBeanDefinitionNames()) {
                Object bean = context.getBean(beanName);
                Class<?> clazz = bean.getClass();
                if (!clazz.getName().startsWith("YOUR.PACKAGE.NAME")) continue;
                // Field injection: any @Autowired field of type SimpMessagingTemplate
                List<FieldWithAnnotation<Autowired>> fields = ReflectionUtils.findFieldsWithAnnotation(clazz, Autowired.class);
                for (FieldWithAnnotation<Autowired> fieldWithAnno : fields) {
                    Field field = fieldWithAnno.getField();
                    if (field.getType() == SimpMessagingTemplate.class) {
                        field.setAccessible(true);
                        try {
                            field.set(bean, template);
                        } catch (Exception e) {
                            // ignored: the bean simply keeps a null template
                        }
                    }
                }
                // Setter injection: any @Autowired single-argument method taking a SimpMessagingTemplate
                List<Method> methods = ReflectionUtils.findMethodsWithAnnotation(clazz, Autowired.class);
                for (Method method : methods) {
                    Class<?>[] paramTypes = method.getParameterTypes();
                    if (paramTypes.length == 1 && paramTypes[0] == SimpMessagingTemplate.class) {
                        method.setAccessible(true);
                        try {
                            method.invoke(bean, template);
                        } catch (Exception e) {
                            // ignored: the bean simply keeps a null template
                        }
                    }
                }
            }
            context = context.getParent();
        }
    }
}
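The reflection trick above can be exercised in isolation with plain JDK classes. Below is a minimal sketch where the annotation and target class are illustrative stand-ins (a String replaces the SimpMessagingTemplate):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class FieldInjectionDemo {

    // Stand-in for Spring's @Autowired
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    public @interface Inject {}

    // Stand-in for a root-context bean expecting a SimpMessagingTemplate
    public static class NotificationService {
        @Inject
        private String messagingTemplate; // String used in place of SimpMessagingTemplate

        public String getMessagingTemplate() {
            return messagingTemplate;
        }
    }

    // Injects 'value' into every annotated field whose type matches
    public static void inject(Object bean, Object value) {
        for (Field field : bean.getClass().getDeclaredFields()) {
            if (field.isAnnotationPresent(Inject.class)
                    && field.getType().isAssignableFrom(value.getClass())) {
                field.setAccessible(true);
                try {
                    field.set(bean, value);
                } catch (IllegalAccessException e) {
                    throw new RuntimeException(e);
                }
            }
        }
    }

    public static void main(String[] args) {
        NotificationService service = new NotificationService();
        inject(service, "template-instance");
        System.out.println(service.getMessagingTemplate()); // prints "template-instance"
    }
}
```

The bean in the answer does essentially this, just discovering targets via the real @Autowired annotation and walking every parent context.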

Related

Java EE 7 Batch API : produce job scoped CDI Bean

I'm currently working on a Java EE 7 Batch API application, and I would like the lifecycle of one of my CDI beans to be tied to the current job.
In effect I would like this bean to have a @JobScoped scope (but that doesn't exist in the API). I would also like this bean to be injectable into any of my job classes.
At first, I wanted to create my own @JobScoped scope, with a JobScopedContext, etc. But then I came up with the idea that the Batch API provides the JobContext bean, which exposes a unique job execution id.
So I wonder if I could manage the lifecycle of my job-scoped bean with this JobContext.
For example, I would have my bean that I want to be job scoped:
@Alternative
public class JobScopedBean
{
    private String m_value;

    public String getValue()
    {
        return m_value;
    }

    public void setValue(String p_value)
    {
        m_value = p_value;
    }
}
Then I would have the producer of this bean, which returns the JobScopedBean associated with the current job (thanks to the JobContext, which is unique per job):
public class ProducerJobScopedBean
{
    @Inject
    private JobContext m_jobContext; // the JobContext of the Batch API

    @Inject
    private JobScopedManager m_manager;

    @Produces
    public JobScopedBean getObjectJobScoped() throws Exception
    {
        if (null == m_jobContext)
        {
            throw new Exception("Job Context not active");
        }
        return m_manager.get(m_jobContext.getExecutionId());
    }
}
And the manager, which holds the map of JobScopedBeans:
@ApplicationScoped
public class JobScopedManager
{
    private final ConcurrentMap<Long, JobScopedBean> mapObjets = new ConcurrentHashMap<Long, JobScopedBean>();

    public JobScopedBean get(final long jobId)
    {
        JobScopedBean returnObject = mapObjets.get(jobId);
        if (null == returnObject)
        {
            final JobScopedBean ajout = new JobScopedBean();
            // putIfAbsent returns the previously mapped bean, or null if ours won the race
            returnObject = mapObjets.putIfAbsent(jobId, ajout);
            if (null == returnObject)
            {
                returnObject = ajout;
            }
        }
        return returnObject;
    }
}
Of course, I will manage the destruction of the JobScopedBean at the end of each job (through a JobListener and a CDI Event).
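As an aside, the get-or-create logic in JobScopedManager can be written more compactly with ConcurrentMap.computeIfAbsent (JDK 8+). A self-contained sketch, with a plain placeholder class standing in for JobScopedBean:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class JobScopedRegistry {

    // Placeholder for JobScopedBean
    public static class ScopedBean {}

    private final ConcurrentMap<Long, ScopedBean> beans = new ConcurrentHashMap<>();

    // Atomically returns the existing bean for the job id, or creates one
    public ScopedBean get(long jobId) {
        return beans.computeIfAbsent(jobId, id -> new ScopedBean());
    }

    // Called at the end of the job, e.g. from a JobListener
    public void destroy(long jobId) {
        beans.remove(jobId);
    }
}
```

With ConcurrentHashMap, computeIfAbsent runs the mapping function atomically and at most once per missing key, so the manual putIfAbsent dance is no longer needed.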
Can you tell me if I'm wrong with this solution?
It looks correct to me but maybe I'm missing something?
May be there is a better way to handle this?
Thanks.
So it boils down to creating @Dependent scoped beans whose state is fixed by the job active at creation time. That works fine for beans with a lifespan shorter than the job, so among the standard scopes only @Dependent (@RequestScoped/@SessionScoped/@ConversationScoped might be OK, but they do not apply here).
It will cause problems for other scopes, especially @ApplicationScoped/@Singleton: if you inject the JobScopedBean into one of them, you might be (un)lucky enough to have an active job the first time it is needed, but the bean will stay attached to that initial job (@Dependent beans are not normal-scoped, so no proxy is created to resolve the contextual instance on each access).
If you want something like that, create a custom scope.

Spring not autowiring field in a class with constructor

I've read questions here in stackoverflow such as:
Anyway to @Autowire a bean that requires constructor arguments?
How to @Autowire bean with constructor
I've also read the links provided in those questions, such as 3.9.3 Fine-tuning annotation-based autowiring with qualifiers, but nothing I tried worked.
Here's my class:
public class UmbrellaRestClient implements UmbrellaClient {

    private static final Logger LOGGER = LoggerFactory.getLogger(UmbrellaRestClient.class);

    private static final Map<String, String> PARAMETROS_INFRA_UMBRELLA = ApplicationContextProvider.getApplicationContext()
            .getBean(ParametrosInfraComponent.class)
            .findByIdParametroLikeAsMap("%UMBRELLA%");

    private final HttpConnectionRest conexaoHttp;

    @Autowired
    @Qualifier
    private TemplateLoaderImpl templateLoader;

    public UmbrellaRestClient(final String url) {
        this.conexaoHttp = new HttpConnectionRest(UmbrellaRestClient.PARAMETROS_INFRA_UMBRELLA.get("UMBRELLA_HOST") + url, "POST", true);
    }

    /**
     * {@inheritDoc}
     */
    @Override
    public String enviarNfe(final String cnpjFilial, final String idPedido, final BigDecimal valorGNRE, final String arquivoNfe) {
        if (StringUtils.isBlank(arquivoNfe)) {
            throw new ClientException("Arquivo de NF-e não carregado.");
        }
        final String usuario = StringUtils.defaultIfBlank(UmbrellaRestClient.PARAMETROS_INFRA_UMBRELLA.get("USUARIO_UMBRELLA"), "WS.INTEGRADOR");
        Map<String, String> parametrosTemplate = new HashMap<>(6);
        parametrosTemplate.put("usuario", usuario);
        parametrosTemplate.put("senha", StringUtils.defaultIfBlank(UmbrellaRestClient.PARAMETROS_INFRA_UMBRELLA.get("SENHA_UMBRELLA"), "WS.INTEGRADOR"));
        parametrosTemplate.put("valorGNRE", valorGNRE.toPlainString());
        parametrosTemplate.put("idPedido", idPedido);
        parametrosTemplate.put("cnpjFilial", cnpjFilial);
        parametrosTemplate.put("arquivoNfe", arquivoNfe);
        final String xmlRequisicao = ConverterUtils.retornarXMLNormalizado(this.templateLoader.preencherTemplate(TemplateType.ENVIO_XML_NFE, parametrosTemplate));
        this.conexaoHttp.setXmlEnvio(xmlRequisicao);
        UmbrellaRestClient.LOGGER.info("XML ENVIO #####################: {}", xmlRequisicao);
        return this.conexaoHttp.enviarXML();
    }
}
The field templateLoader does not get injected. I tested dependency injection in other classes and it works. I suspect this happens because the class has a constructor that takes a parameter, and that parameter is passed in by each caller, so I cannot define the constructor argument in applicationContext, for example.
What should I do to get field injected?
Using REST APIs with the Spring framework needs to be handled differently. Here is a brief explanation.
Spring is a framework that maintains the lifecycle of its component beans and is fully responsible for them, from creation to destruction.
REST containers are likewise responsible for maintaining the lifecycle of the web service objects they create.
So Spring and the REST container work independently to manage the components each of them has created.
In a recent project, to use both technologies I created a separate class implementing Spring's ApplicationContextAware interface and collected the beans in a HashMap. That resource can then be accessed statically from REST contexts.
The weak point is that you have to use a beans.xml file to register the beans, and in the class implementing ApplicationContextAware you fetch the beans by name, etc.
The easiest way to create a Spring-controlled bean is directly through the ApplicationContext:
@Autowired
private ApplicationContext context;

private UmbrellaRestClient getNewUmbrellaRestClient(String url) {
    return context.getBean("umbrellaRestClient", new Object[]{url});
}
Basically this is a factory method. For this to work, UmbrellaRestClient must be declared as a bean of scope prototype, as must all beans that have a non-default constructor.
In the case where the class is in a package that is component-scanned, this will suffice:
@Service
@Scope("prototype")
public class UmbrellaRestClient implements UmbrellaClient {
    ...

How to refer a Spring bean at run time in a different context

All,
I have an applicationContext whose beans are initialized at application startup - I will call this the parent context.
I have another applicationContext (secondary) that is deployed into my application at run time. Its beans are manually read and loaded into my parent context: I load all the beans up front and then register them as singletons in the parent context, as in the snippet below. This works as expected.
ApplicationContext fileContext = new FileSystemXmlApplicationContext("file:" + fileList.get(i).getPath());
Map<String, List> beansOfType = fileContext.getBeansOfType(List.class, false, false);
String[] beanNames = fileContext.getBeanDefinitionNames();
ConfigurableApplicationContext parentConfContext = (ConfigurableApplicationContext) parentContext;
BeanDefinitionRegistry beanReg = (BeanDefinitionRegistry) parentConfContext.getAutowireCapableBeanFactory();
for (String string : beanNames) {
    try {
        beanReg.removeBeanDefinition(string);
    }
    catch (NoSuchBeanDefinitionException e) {
        // ignore: the bean did not exist in the parent yet
    }
    parentConfContext.getBeanFactory().registerSingleton(string, fileContext.getBean(string));
}
Now I want to refer to a bean from my parent context in my secondary application context (I am passing a bean from the parent context as a property reference in my secondary context), but when I do this I get a 'No bean named' exception, which is obvious because the secondary context has no idea about the parent context.
I have tried setting lazy-init="true" on the bean in the secondary context, but this does not help. Can someone suggest how to overcome this?
regards
D
If you refer to a parent bean from the child context, you just have to tell the child context about its parent:
ConfigurableApplicationContext parentConfContext = (ConfigurableApplicationContext) parentContext;
ApplicationContext fileContext =
        new FileSystemXmlApplicationContext(new String[] {"file:" + fileList.get(i).getPath()},
                parentConfContext);
In Spring, contexts are independent. If you want a bean to be available in two contexts, you need a common configuration file/class.
For example, say you have a file/class A for the first context and a file/class B for the second. If a bean needs to exist in both the A and B contexts, you need a third file/class C with that bean definition, and then you import C into both A and B.
https://spring.io/understanding/application-context
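For instance (the file names here are purely illustrative), the shared definition lives in C.xml and both context files import it:

```xml
<!-- C.xml: the shared bean definition -->
<bean id="sharedBean" class="com.example.SharedBean"/>

<!-- In both A.xml and B.xml -->
<import resource="classpath:C.xml"/>
```

Note that each context still creates its own instance of sharedBean; it is the definition, not the instance, that is shared.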

Jersey Endpoint+OSGi Dependency, Keeping Track

I have a Jersey endpoint which uses a custom OSGi ExceptionManager service.
@Path("service")
public class ServiceFacade {

    private volatile ExceptionManager exceptionManager;

    public ServiceFacade() {
        BundleContext bC = FrameworkUtil.getBundle(ServiceFacade.class).getBundleContext();
        ServiceReference<ExceptionManager> sR = bC.getServiceReference(ExceptionManager.class);
        if (sR != null)
            this.exceptionManager = bC.getService(sR);
    }

    @GET
    @Consumes({MediaType.APPLICATION_JSON})
    @Produces(MediaType.APPLICATION_JSON)
    public Response sayHello() {
        try {
            if (exceptionManager == null)
                return Response.status(Status.SERVICE_UNAVAILABLE).build();
            // Do some work...
            return Response.ok().build();
        } catch (Exception e) {
            exceptionManager.handle(e);
            return Response.serverError().build();
        }
    }
}
This Jersey class is added to the Jersey Application as a simple class, which means that every time a user hits this endpoint, a new instance of the class is created to handle the request. As you can see, the constructor initializes the ExceptionManager service. My question is: isn't there a simpler way of retrieving the service without going through the BundleContext?
I have looked at DependencyManager, but that bundle seems to add the dependencies to the class (ServiceFacade in this case) only during the activation process. That resolution happens too early; here it has to be done at run time, every time an instance is created. Below is an approximation with DependencyManager, but it is not a solution for this:
public class Activator extends DependencyActivatorBase {
    @Override
    public void init(BundleContext bundleContext, DependencyManager dependencyManager) throws Exception {
        dependencyManager.add(createComponent()
            .setImplementation(ServiceFacade.class)
            .add(createServiceDependency()
                .setService(ExceptionManager.class)
                .setRequired(true)));
    }
}
Thanks.
You can obtain the reference to an OSGi service without accessing the BundleContext by using Declarative Services. A tutorial can be found here.
You can make the endpoint a singleton resource. This way you can let the dependency manager create a single instance, inject the services, and then add that instance to the Jersey application.
There are a few limitations: for example, Jersey's field or constructor injection does not work. You also have to be careful about concurrency when using fields of the resource.

Spring @Transactional rollbackFor not working for checked Exception for method called outside proxy object

I am having a problem with the rollback of Hibernate updates in combination with Spring.
I have the following class:
@Service
@Transactional(readOnly = true)
public class DocumentServiceImpl extends AbstractGenericService<Document, IDocumentDao> implements DocumentService {

    @Override
    @Transactional(readOnly = false, rollbackFor = DocumentServiceException.class)
    public void saveDocument(final DocumentForm form, final BindingResult result, final CustomUserContext userContext) throws DocumentServiceException {
        Document document = locateDocument(form, userContext);
        if (!result.hasErrors()) {
            try {
                updateDocumentCategories(form, document);
                storeDocument(document, form.getDocumentId(), form.getFile());
                solrService.addDocument(document);
            } catch (IOException e) {
                result.reject("error.uploading.file");
                throw new DocumentServiceException("Error trying to copy the uploaded file to its final destination", e);
            } catch (SolrServerException e) {
                result.reject("error.uploading.file.solr");
                throw new DocumentServiceException("Solr had an error parsing your uploaded file", e);
            }
        }
    }

    @Override
    @Transactional(readOnly = false, rollbackFor = IOException.class)
    public void storeDocument(Document document, String documentId, CommonsMultipartFile uploadedFile) throws IOException {
        getDao().saveOrUpdate(document);
        if (StringUtils.isBlank(documentId)) {
            File newFile = documentLocator.createFile(document);
            uploadedFile.transferTo(newFile);
            // Todo: TEST FOR ROLLBACK ON FILE I/O EXCEPTION
            throw new IOException("this is a test");
        }
    }
}
The interface is not tagged with any @Transactional annotations. The saveDocument() method is called directly from my Controller, so I expect the @Transactional configuration of that method to be used, in particular the rollbackFor parameter. However, when the DocumentServiceException is thrown, nothing is rolled back (i.e. the getDao().saveOrUpdate(document) is persisted). For testing purposes I added a "throw new IOException" in the storeDocument method. I hope somebody can help me work out how to get this working; it would be greatly appreciated.
The @Transactional annotation is properly placed. You don't have to set it at the interface level, because it's not automatically inherited (what if your concrete class implemented two interfaces with conflicting transactional settings?).
When you say the methods are called directly, I assume you have the interface @Autowired and not the concrete implementation.
Place a breakpoint in your service method and check whether you have a TransactionInterceptor entry in your stack trace. If you don't, then your transaction management configuration is wrong and you are not using Spring transaction management at all.
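The "method called outside proxy object" part of the title is worth illustrating: Spring applies @Transactional through a proxy, so a call from saveDocument() to this.storeDocument() never passes through the interceptor, and storeDocument's own rollbackFor settings are ignored. The same effect can be shown with a plain JDK dynamic proxy (a minimal sketch; all names are illustrative):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

public class SelfInvocationDemo {

    public interface Service {
        void outer();
        void inner();
    }

    public static class ServiceImpl implements Service {
        @Override public void outer() {
            // Self-invocation: goes through 'this', NOT through the proxy,
            // so the interceptor below never sees it.
            inner();
        }
        @Override public void inner() {}
    }

    // Records every method name that actually passes through the proxy
    public static final List<String> INTERCEPTED = new ArrayList<>();

    public static Service proxy(Service target) {
        InvocationHandler handler = (p, method, args) -> {
            INTERCEPTED.add(method.getName()); // stand-in for transaction advice
            return method.invoke(target, args);
        };
        return (Service) Proxy.newProxyInstance(
                Service.class.getClassLoader(),
                new Class<?>[] {Service.class},
                handler);
    }

    public static void main(String[] args) {
        Service s = proxy(new ServiceImpl());
        s.outer(); // only "outer" is intercepted; the nested inner() call bypasses the proxy
        System.out.println(INTERCEPTED); // prints [outer]
    }
}
```

This is why @Transactional settings on a method only take effect when the call enters through the proxied reference (e.g. an autowired interface), not via an internal call.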
Update from Dennis
One more thing that could help other people perhaps:
I had the tx:annotation-driven in my applicationContext. The applicationContext contained a component-scan for all beans (no filters).
However, the dispatcherServlet context also contained a component-scan for all beans (legacy code, don't shoot the messenger). So basically I had a copy of all my beans because they were scanned in both contexts.
And because the beans created in the dispatcherServlet context did not contain the tx:annotation-driven element, the services beans in the dispatcherServlet context were not transactional.
I had to change the component-scan in the dispatcherServlet context into:
<context:component-scan base-package="your/base/package" use-default-filters="false">
    <context:include-filter type="annotation" expression="org.springframework.stereotype.Controller"/>
</context:component-scan>
So it will only instantiate the controllers in the dispatcher servlet context (and not their autowired dependencies, such as the services), while the services/DAOs live in the applicationContext.
The services from the applicationContext are then transactionalized.
