AbstractSoapInterceptor is not working with cxf:cxfEndpoint - java

I have tried to log a message from a CXF interceptor which extends AbstractSoapInterceptor, and added that interceptor to the chain.
Blueprint.xml
<cxf:cxfEndpoint id="reportEndpoint" address="/report/"
    serviceClass="com.shajeer.integration.helloworld.incident.IncidentService">
    <cxf:inInterceptors>
        <bean id="inInterceptor"
            class="com.shajeer.integration.helloworld.logging.LoggingInSetupInterceptor" />
    </cxf:inInterceptors>
</cxf:cxfEndpoint>
The interceptor:
import org.apache.cxf.binding.soap.SoapMessage;
import org.apache.cxf.binding.soap.interceptor.AbstractSoapInterceptor;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.phase.Phase;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingInSetupInterceptor extends AbstractSoapInterceptor {

    private static final Logger LOGGER = LoggerFactory.getLogger(LoggingInSetupInterceptor.class);

    public LoggingInSetupInterceptor() {
        super(Phase.PRE_INVOKE);
    }

    @Override
    public void handleMessage(SoapMessage soapMessage) throws Fault {
        System.out.println("In LoggingInSetupInterceptor :: LoggingInSetupInterceptor");
        LOGGER.info("In LoggingInSetupInterceptor :: LoggingInSetupInterceptor");
    }
}
But the control flow is not even reaching the interceptor; it goes directly into the Camel context. What can be the reason?
CXF namespace declaration:
`xmlns:cxf="http://camel.apache.org/schema/blueprint/cxf"`

Please check whether the message version is Soap11 or Soap12. Add the below snippet and debug:
import java.util.List;
import java.util.Map;

import org.apache.cxf.binding.soap.Soap11;
import org.apache.cxf.binding.soap.Soap12;
import org.apache.cxf.binding.soap.SoapMessage;
import org.apache.cxf.helpers.CastUtils;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.message.Message;

public void handleMessage(SoapMessage message) throws Fault {
    if (message.getVersion() instanceof Soap11) {
        Map<String, List<String>> headers = CastUtils.cast((Map<?, ?>) message.get(Message.PROTOCOL_HEADERS));
        if (headers != null) {
            List<String> sa = headers.get("SOAPAction");
            if (sa != null && sa.size() > 0) {
                String action = sa.get(0);
                // SOAPAction values are often quoted; strip the surrounding quotes
                if (action.startsWith("\"")) {
                    action = action.substring(1, action.length() - 1);
                }
                getAndSetOperation(message, action); // helper referenced by the original snippet
            }
        }
    } else if (message.getVersion() instanceof Soap12) {
        // SOAP 1.2 has no SOAPAction header; the action travels in the
        // Content-Type "action" parameter instead
    }
}

Related

Unable to log using Zuul API gateway

This is my ZuulFilter.java:
import javax.servlet.http.HttpServletRequest;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;
import com.netflix.zuul.exception.ZuulException;

@Component
public class ZuulLoggingFilter extends ZuulFilter {

    private Logger logger = LoggerFactory.getLogger(this.getClass());

    @Override
    public boolean shouldFilter() {
        return false;
    }

    @Override
    public Object run() throws ZuulException {
        HttpServletRequest request = RequestContext.getCurrentContext().getRequest();
        logger.info("request -->{} request uri -->{}", request, request.getRequestURI());
        return null;
    }

    @Override
    public String filterType() {
        return "pre";
    }

    @Override
    public int filterOrder() {
        return 1;
    }
}
I have used the SLF4J library to log, but I can't see any log in my console after using the API gateway.
I have accessed the API gateway using the following URL:
http://localhost:8765/{application-name}/{uri}.
I get the response properly, but nothing is logged to the console.
shouldFilter() must return true for run() to be called:
@Override
public boolean shouldFilter() {
    return true;
}
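If you only want to log certain requests, shouldFilter() can also inspect the current request instead of returning a constant. A minimal sketch (the /api prefix is a hypothetical example):
@Override
public boolean shouldFilter() {
    // Hypothetical example: run this filter only for URIs under /api
    HttpServletRequest request = RequestContext.getCurrentContext().getRequest();
    return request.getRequestURI().startsWith("/api");
}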

Kafka Streams exception: Could not find a public no-argument constructor for org.apache.kafka.common.serialization.Serdes$WrapperSerde

Getting the below error stack trace while working with Kafka Streams.
UPDATE: as per @matthias-j-sax, I have implemented my own Serdes with a default constructor for WrapperSerde, but I am still getting the following exceptions:
org.apache.kafka.streams.errors.StreamsException: stream-thread [streams-request-count-4c239508-6abe-4901-bd56-d53987494770-StreamThread-1] Failed to rebalance.
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:836)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)
Caused by: org.apache.kafka.streams.errors.StreamsException: Failed to configure value serde class myapps.serializer.Serdes$WrapperSerde
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde (StreamsConfig.java:972)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init> (AbstractProcessorContext.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init> (ProcessorContextImpl.java:42)
at org.apache.kafka.streams.processor.internals.StreamTask.<init> (StreamTask.java:136)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:405)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:369)
at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks (StreamThread.java:354)
at org.apache.kafka.streams.processor.internals.TaskManager.addStreamTasks (TaskManager.java:148)
at org.apache.kafka.streams.processor.internals.TaskManager.createTasks (TaskManager.java:107)
at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned (StreamThread.java:260)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete (ConsumerCoordinator.java:259)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded (AbstractCoordinator.java:367)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup (AbstractCoordinator.java:316)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll (ConsumerCoordinator.java:290)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce (KafkaConsumer.java:1149)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll (KafkaConsumer.java:1115)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:827)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)
Caused by: java.lang.NullPointerException
at myapps.serializer.Serdes$WrapperSerde.configure (Serdes.java:30)
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde (StreamsConfig.java:968)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init> (AbstractProcessorContext.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init> (ProcessorContextImpl.java:42)
at org.apache.kafka.streams.processor.internals.StreamTask.<init> (StreamTask.java:136)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:405)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:369)
at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks (StreamThread.java:354)
at org.apache.kafka.streams.processor.internals.TaskManager.addStreamTasks (TaskManager.java:148)
at org.apache.kafka.streams.processor.internals.TaskManager.createTasks (TaskManager.java:107)
at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned (StreamThread.java:260)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete (ConsumerCoordinator.java:259)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded (AbstractCoordinator.java:367)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup (AbstractCoordinator.java:316)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll (ConsumerCoordinator.java:290)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce (KafkaConsumer.java:1149)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll (KafkaConsumer.java:1115)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:827)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)
Here's my use case:
I will be getting JSON responses as input to the stream, and I want to count requests whose status codes are not 200. Initially, I went through the Kafka Streams documentation (both the official docs and Confluent's) and implemented the WordCountDemo, which works fine. Then I tried to write this code, but I am getting this exception. I am very new to Kafka Streams; I went through the stack trace but couldn't understand the context, hence I came here for help!
Here's my code
LogCount.java
package myapps;

import java.util.Properties;
import java.util.concurrent.CountDownLatch;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import myapps.serializer.JsonDeserializer;
import myapps.serializer.JsonSerializer;

public class LogCount {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-request-count");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        JsonSerializer<Request> requestJsonSerializer = new JsonSerializer<>();
        JsonDeserializer<Request> requestJsonDeserializer = new JsonDeserializer<>(Request.class);
        Serde<Request> requestSerde = Serdes.serdeFrom(requestJsonSerializer, requestJsonDeserializer);

        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, requestSerde.getClass().getName());

        final StreamsBuilder builder = new StreamsBuilder();

        KStream<String, Request> source = builder.stream("streams-requests-input");
        source.filter((k, v) -> v.getHttpStatusCode() != 200)
              .groupByKey()
              .count()
              .toStream()
              .to("streams-requests-output", Produced.with(Serdes.String(), Serdes.Long()));

        final Topology topology = builder.build();
        final KafkaStreams streams = new KafkaStreams(topology, props);
        final CountDownLatch latch = new CountDownLatch(1);

        System.out.println(topology.describe());

        // attach shutdown handler to catch control-c
        Runtime.getRuntime().addShutdownHook(new Thread("streams-shutdown-hook") {
            @Override
            public void run() {
                streams.close();
                latch.countDown();
            }
        });

        try {
            streams.cleanUp();
            streams.start();
            latch.await();
        } catch (Throwable e) {
            System.exit(1);
        }
        System.exit(0);
    }
}
JsonDeserializer.java
package myapps.serializer;

import java.util.Map;

import com.google.gson.Gson;
import org.apache.kafka.common.serialization.Deserializer;

public class JsonDeserializer<T> implements Deserializer<T> {

    private Gson gson = new Gson();
    private Class<T> deserializedClass;

    public JsonDeserializer(Class<T> deserializedClass) {
        this.deserializedClass = deserializedClass;
    }

    public JsonDeserializer() {
    }

    @Override
    @SuppressWarnings("unchecked")
    public void configure(Map<String, ?> map, boolean b) {
        if (deserializedClass == null) {
            deserializedClass = (Class<T>) map.get("serializedClass");
        }
    }

    @Override
    public T deserialize(String s, byte[] bytes) {
        if (bytes == null) {
            return null;
        }
        return gson.fromJson(new String(bytes), deserializedClass);
    }

    @Override
    public void close() {
    }
}
JsonSerializer.java
package myapps.serializer;

import java.nio.charset.Charset;
import java.util.Map;

import com.google.gson.Gson;
import org.apache.kafka.common.serialization.Serializer;

public class JsonSerializer<T> implements Serializer<T> {

    private Gson gson = new Gson();

    @Override
    public void configure(Map<String, ?> map, boolean b) {
    }

    @Override
    public byte[] serialize(String topic, T t) {
        return gson.toJson(t).getBytes(Charset.forName("UTF-8"));
    }

    @Override
    public void close() {
    }
}
As I mentioned, I will be getting JSON as input; the structure is like this:
{
    "RequestID": "1f6b2409",
    "Protocol": "http",
    "Host": "abc.com",
    "Method": "GET",
    "HTTPStatusCode": "200",
    "User-Agent": "curl%2f7.54.0"
}
The corresponding Request.java file looks like this:
package myapps;

public final class Request {

    private String requestID;
    private String protocol;
    private String host;
    private String method;
    private int httpStatusCode;
    private String userAgent;

    public String getRequestID() {
        return requestID;
    }

    public void setRequestID(String requestID) {
        this.requestID = requestID;
    }

    public String getProtocol() {
        return protocol;
    }

    public void setProtocol(String protocol) {
        this.protocol = protocol;
    }

    public String getHost() {
        return host;
    }

    public void setHost(String host) {
        this.host = host;
    }

    public String getMethod() {
        return method;
    }

    public void setMethod(String method) {
        this.method = method;
    }

    public int getHttpStatusCode() {
        return httpStatusCode;
    }

    public void setHttpStatusCode(int httpStatusCode) {
        this.httpStatusCode = httpStatusCode;
    }

    public String getUserAgent() {
        return userAgent;
    }

    public void setUserAgent(String userAgent) {
        this.userAgent = userAgent;
    }
}
EDIT: when I exit from kafka-console-consumer.sh, it says "Processed a total of 0 messages".
As the error indicates, a class is missing a public no-argument constructor for Serdes$WrapperSerde:
Could not find a public no-argument constructor
The issue is this construct:
Serde<Request> requestSerde = Serdes.serdeFrom(requestJsonSerializer, requestJsonDeserializer);
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, requestSerde.getClass().getName());
Serdes.serdeFrom returns a WrapperSerde that does not have an empty default constructor. Thus, you cannot pass it into the StreamsConfig. You can use Serdes generated like this only if you pass the objects into the corresponding API calls (i.e., to overwrite the default Serde for certain operators).
To make it work (i.e., to be able to set the Serde in the config), you would need to implement a proper class that implements the Serde interface.
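For example, a minimal sketch of such a class, reusing the JsonSerializer and JsonDeserializer from the question (the class name RequestSerde is hypothetical):
package myapps.serializer;

import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serializer;

import myapps.Request;

// A Serde with a public no-argument constructor, so that StreamsConfig can
// instantiate it reflectively from the class name.
public class RequestSerde implements Serde<Request> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public void close() {
    }

    @Override
    public Serializer<Request> serializer() {
        return new JsonSerializer<>();
    }

    @Override
    public Deserializer<Request> deserializer() {
        return new JsonDeserializer<>(Request.class);
    }
}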
The requestSerde.getClass().getName() did not work for me. I needed to provide my own WrapperSerde implementation in an inner class. You probably need to do the same with something like:
public class MySerde extends WrapperSerde<Request> {
    public MySerde() {
        // The serializer/deserializer instances from main() are not in scope
        // here, so construct fresh ones.
        super(new JsonSerializer<>(), new JsonDeserializer<>(Request.class));
    }
}
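With such a class in place (declared as a standalone class, or a public static nested class so it can be instantiated reflectively), it can then be registered in the config by name:
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, MySerde.class.getName());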
Instead of specifying it in the properties, pass the custom serde when creating the stream:
KStream<String, Request> source = builder.stream("streams-requests-input", Consumed.with(Serdes.String(), requestSerde));

Spring Integration File reading

I am a newbie to Spring Integration. I am working on a solution, but I am stuck on a specific issue while using the inbound file adapter (FileReadingMessageSource).
I have to read files from different directories, process them, and save the files to different directories. As I understand it, the directory name is fixed at the start of the flow.
Can someone help me with changing the directory name for different requests?
I attempted the following. First of all, I am not sure whether this is the correct way to go about it, and it worked for only one directory. I think the Poller was waiting for more files and never came back to read another directory.
@SpringBootApplication
@EnableIntegration
@IntegrationComponentScan
public class SiSampleFileProcessor {

    @Autowired
    MyFileProcessor myFileProcessor;

    @Value("${si.outdir}")
    String outDir;

    @Autowired
    Environment env;

    public static void main(String[] args) throws IOException {
        ConfigurableApplicationContext ctx = new SpringApplication(SiSampleFileProcessor.class).run(args);
        FileProcessingService gateway = ctx.getBean(FileProcessingService.class);
        boolean process = true;
        while (process) {
            System.out.println("Please enter the input Directory: ");
            String inDir = new Scanner(System.in).nextLine();
            if (inDir.isEmpty() || inDir.equals("exit")) {
                process = false;
            } else {
                System.out.println("Processing... " + inDir);
                gateway.processFilesin(inDir);
            }
        }
        ctx.close();
    }

    @MessagingGateway(defaultRequestChannel = "requestChannel")
    public interface FileProcessingService {
        String processFilesin(String inputDir);
    }

    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata poller() {
        return Pollers.fixedDelay(1000).get();
    }

    @Bean
    public MessageChannel requestChannel() {
        return new DirectChannel();
    }

    @ServiceActivator(inputChannel = "requestChannel")
    @Bean
    GenericHandler<String> fileReader() {
        return new GenericHandler<String>() {
            @Override
            public Object handle(String p, Map<String, Object> map) {
                FileReadingMessageSource fileSource = new FileReadingMessageSource();
                fileSource.setDirectory(new File(p));
                Message<File> msg;
                while ((msg = fileSource.receive()) != null) {
                    fileInChannel().send(msg);
                }
                return null; // Not sure what to return!
            }
        };
    }

    @Bean
    public MessageChannel fileInChannel() {
        return MessageChannels.queue("fileIn").get();
    }

    @Bean
    public IntegrationFlow fileProcessingFlow() {
        return IntegrationFlows.from(fileInChannel())
                .handle(myFileProcessor)
                .handle(Files.outboundAdapter(new File(outDir)).autoCreateDirectory(true).get())
                .get();
    }
}
EDIT: Based on Gary's response, I replaced some methods as follows:
@MessagingGateway(defaultRequestChannel = "requestChannel")
public interface FileProcessingService {
    boolean processFilesin(String inputDir);
}

@ServiceActivator(inputChannel = "requestChannel")
public boolean fileReader(String inDir) {
    FileReadingMessageSource fileSource = new FileReadingMessageSource();
    fileSource.setDirectory(new File(inDir));
    fileSource.afterPropertiesSet();
    fileSource.start();
    Message<File> msg;
    while ((msg = fileSource.receive()) != null) {
        fileInChannel().send(msg);
    }
    fileSource.stop();
    System.out.println("Sent all files in directory: " + inDir);
    return true;
}
Now it is working as expected.
You can use this code
FileProcessor.java
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;

@Component
public class FileProcessor {

    private static final String HEADER_FILE_NAME = "file_name";
    private static final String MSG = "%s received. Content: %s";

    public void process(Message<String> msg) {
        String fileName = (String) msg.getHeaders().get(HEADER_FILE_NAME);
        String content = msg.getPayload();
        //System.out.println(String.format(MSG, fileName, content));
        System.out.println(content);
    }
}
LastModifiedFileFilter.java
package com.example.demo;

import java.io.File;
import java.util.HashMap;
import java.util.Map;

import org.springframework.integration.file.filters.AbstractFileListFilter;

public class LastModifiedFileFilter extends AbstractFileListFilter<File> {

    private final Map<String, Long> files = new HashMap<>();
    private final Object monitor = new Object();

    @Override
    protected boolean accept(File file) {
        synchronized (this.monitor) {
            Long previousModifiedTime = files.put(file.getName(), file.lastModified());
            return previousModifiedTime == null || previousModifiedTime != file.lastModified();
        }
    }
}
Main class: DemoApplication.java
package com.example.demo;

import java.io.File;
import java.io.IOException;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.filters.CompositeFileListFilter;
import org.springframework.integration.file.filters.SimplePatternFileListFilter;
import org.springframework.integration.file.transformer.FileToStringTransformer;
import org.springframework.messaging.MessageChannel;

@SpringBootApplication
@Configuration
public class DemoApplication {

    private static final String DIRECTORY = "E:/usmandata/logs/input/";

    public static void main(String[] args) throws IOException, InterruptedException {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    public IntegrationFlow processFileFlow() {
        return IntegrationFlows
                .from("fileInputChannel")
                .transform(fileToStringTransformer())
                .handle("fileProcessor", "process").get();
    }

    @Bean
    public MessageChannel fileInputChannel() {
        return new DirectChannel();
    }

    @Bean
    @InboundChannelAdapter(value = "fileInputChannel", poller = @Poller(fixedDelay = "1000"))
    public MessageSource<File> fileReadingMessageSource() {
        CompositeFileListFilter<File> filters = new CompositeFileListFilter<>();
        filters.addFilter(new SimplePatternFileListFilter("*.log"));
        filters.addFilter(new LastModifiedFileFilter());
        FileReadingMessageSource source = new FileReadingMessageSource();
        source.setAutoCreateDirectory(true);
        source.setDirectory(new File(DIRECTORY));
        source.setFilter(filters);
        return source;
    }

    @Bean
    public FileToStringTransformer fileToStringTransformer() {
        return new FileToStringTransformer();
    }

    @Bean
    public FileProcessor fileProcessor() {
        return new FileProcessor();
    }
}
The FileReadingMessageSource uses a DirectoryScanner internally; it is normally set up by Spring after the properties are injected. Since you are managing the object outside of Spring, you need to call the Spring bean initialization and lifecycle methods afterPropertiesSet(), start(), and stop().
Call stop() when receive() returns null.
> return null; // Not sure what to return!
If you return nothing, your calling thread will hang in the gateway waiting for a response. You could change the gateway to return void or, since your gateway is expecting a String, just return some value.
However, your calling code is not looking at the result anyway.
> gateway.processFilesin(inDir);
Also, remove the @Bean from the @ServiceActivator; with that style, the bean type must be MessageHandler.
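In other words, if you keep the @Bean style, the method must produce a MessageHandler bean. A minimal sketch of that variant (the handler body is illustrative only):
@Bean
@ServiceActivator(inputChannel = "requestChannel")
public MessageHandler requestHandler() {
    // With @Bean + @ServiceActivator, the bean itself must be a MessageHandler.
    return message -> System.out.println("Received: " + message.getPayload());
}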

Spring WebSockets does not send message from server

Hi, I have an issue with Spring WebSockets. This is the scenario:
A standalone application sends some data remotely, such as a date (Date), a procedence (String), and a weight (BigDecimal). This data is sent via TCP to a socket and then saved into the database; up to this point all is fine. But in the next step (WebSocket) I cannot show this information in a webpage, and the weight data must be shown live on the screen.
This is my WebSocket configuration:
import java.util.List;

import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.converter.MessageConverter;
import org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolver;
import org.springframework.messaging.handler.invocation.HandlerMethodReturnValueHandler;
import org.springframework.messaging.simp.config.ChannelRegistration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;
import org.springframework.web.socket.config.annotation.WebSocketTransportRegistration;

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfiguration implements WebSocketMessageBrokerConfigurer {

    @Override
    public void registerStompEndpoints(final StompEndpointRegistry registry) {
        registry.addEndpoint("/indicator").withSockJS();
    }

    @Override
    public void configureClientInboundChannel(final ChannelRegistration registration) {
    }

    @Override
    public void configureClientOutboundChannel(final ChannelRegistration registration) {
    }

    @Override
    public void configureMessageBroker(final MessageBrokerRegistry registry) {
    }

    @Override
    public void configureWebSocketTransport(WebSocketTransportRegistration wstr) {
    }

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> list) {
    }

    @Override
    public void addReturnValueHandlers(List<HandlerMethodReturnValueHandler> list) {
    }

    @Override
    public boolean configureMessageConverters(List<MessageConverter> list) {
        return Boolean.TRUE;
    }
}
This is my other class, which receives data from the socket, processes the information, and sends it to the WebSocket:
import com.mcss.mcontrols.helper.ByteHelper;
import com.spc.basweb.Constants;
import com.spc.basweb.service.BroadcastingService;
import com.spc.basweb.service.DataProcessorService;
import com.spc.basweb.transmissor.dto.Transmission;

import java.io.IOException;

import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationListener;
import org.springframework.integration.annotation.MessageEndpoint;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.annotation.Transformer;
import org.springframework.messaging.core.MessageSendingOperations;
import org.springframework.messaging.simp.broker.BrokerAvailabilityEvent;

@MessageEndpoint
public class BroadcastingServiceImpl implements BroadcastingService, ApplicationListener<BrokerAvailabilityEvent> {

    private static final Logger LOGGER = Logger.getLogger(BroadcastingServiceImpl.class);

    private final MessageSendingOperations<String> messagingTemplate;
    private String processedData;

    @Autowired
    DataProcessorService dataProcessorService;

    @Autowired
    public BroadcastingServiceImpl(final MessageSendingOperations<String> messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    @Override
    public String getProcessedData() {
        return processedData;
    }

    @Override
    @ServiceActivator(inputChannel = "broadcaster")
    public String broadcast(byte[] bytes) {
        try {
            Transmission t = (Transmission) ByteHelper.toObject(bytes);
            LOGGER.debug(t.getProcedence() + " " + t.getDate() + " " + t.getWeight());
            String rm = this.dataProcessorService.processData(t);
            this.messagingTemplate.convertAndSend(Constants.END_POINT_READ, this.dataProcessorService.getWeighing().getWeight().toString());
            return rm;
        } catch (IOException | ClassNotFoundException ex) {
            LOGGER.error("Error de transmision de objetos", ex);
        }
        return DataProcessorService.NOT_OK_RESPONSE;
    }

    @Override
    public void onApplicationEvent(BrokerAvailabilityEvent e) {
        LOGGER.debug("Application event");
    }

    @Transformer(outputChannel = "broadcaster")
    public String convert(String response) {
        return response;
    }
}
In the debugger I'm getting this information:
30-03-2016 15:07:20 DEBUG SimpleBrokerMessageHandler:277 - Processing MESSAGE destination=/read session=null payload=3003
In another class (a controller) I'm using the same method:
this.messagingTemplate.convertAndSend(Constants.END_POINT_READ, "3500");
and sending the information "manually", and it is shown correctly. In the debugger I'm getting this message:
30-03-2016 15:05:18 DEBUG SimpleBrokerMessageHandler:277 - Processing MESSAGE destination=/read session=dfR45V77 payload=3500
The difference is in the session value, but I don't know why the session is null in the first case. What am I doing wrong? Any clarification or help is welcome.
First of all, I don't see a configureMessageBroker implementation, so it isn't clear how that may work at all...
On the other hand, if you see such a difference, try to debug the code in the SimpMessagingTemplate.
I only see headerAccessor.setSessionId(sessionId); in the SimpleBrokerMessageHandler when it does
logger.debug("Broadcasting to " + subscriptions.size() + " sessions.");
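As a starting point, here is a minimal sketch of what configureMessageBroker typically looks like; the /read prefix matches the destination in the debug log above, while /app is an assumed, conventional application prefix:
@Override
public void configureMessageBroker(final MessageBrokerRegistry registry) {
    // Enable the in-memory simple broker for server-to-client destinations.
    // "/read" matches the destination seen in the debug log above.
    registry.enableSimpleBroker("/read");
    // Prefix for client-to-server messages handled by @MessageMapping methods
    // (an assumed value; adjust to your application).
    registry.setApplicationDestinationPrefixes("/app");
}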

Mule Properties in a Java component

In my Mule project, I have a property file that contains my HTTP address, for instance:
server.address=http://localhost:8080/test/
In my flow I reference it as ${server.address}, but how do I reference that property in a Java component class? For instance:
public String address = ${server.address}
You can set the property in your Java class as a Spring property. For a singleton object, the property would be set when the flow is started.
<flow name="propertyprojectFlow1" doc:name="propertyprojectFlow1">
    <http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8581" path="echoServer" doc:name="HTTP"/>
    <component doc:name="Java">
        <singleton-object class="MyClass">
            <property key="server" value="${server.address}"/>
        </singleton-object>
    </component>
</flow>
The class would need to have a setter for the property.
import org.apache.log4j.Logger;
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class MyClass implements Callable {

    private static Logger logger = Logger.getLogger(MyClass.class);
    private String server;

    public MyClass() {
    }

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        logger.info("Server is " + server);
        return null;
    }

    public void setServer(String server) {
        logger.info("Setting server to " + server);
        this.server = server;
    }
}
Property name in properties file:
your.property.name.in.properties.file=testing
Java component:
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;
import org.springframework.beans.factory.annotation.Value;

public class PropertyTest implements Callable {

    @Value("${your.property.name.in.properties.file}")
    private String yourPropertyNameInPropertiesFile;

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        System.out.println("yourPropertyNameInPropertiesFile: " + yourPropertyNameInPropertiesFile);
        return null;
    }
}
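Note that @Value is only resolved when the class is instantiated as a Spring bean rather than by Mule from the class name. A hedged sketch of wiring this up with Mule 3's Spring support (the bean id, class, and flow name are examples, and the spring namespace must be declared in the config):
<spring:beans>
    <spring:bean id="propertyTest" class="com.example.PropertyTest"/>
</spring:beans>

<flow name="propertyFlow" doc:name="propertyFlow">
    <component doc:name="Java">
        <spring-object bean="propertyTest"/>
    </component>
</flow>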
You can load that property, or any other property, by loading the properties file yourself (although this could create performance issues if the file is loaded every time the flow is executed).
private Properties properties;

public Properties loadProperties(String name) {
    if (properties == null) {
        properties = new Properties();
        // Load the file from the classpath via the context class loader
        try (InputStream in = Thread.currentThread().getContextClassLoader().getResourceAsStream(name)) {
            properties.load(in);
        } catch (IOException e) {
            throw new RuntimeException("Could not load properties file: " + name, e);
        }
    }
    return properties;
}
And then:
Properties props = loadProperties("application.properties");
String url = props.getProperty("server.address");
Making some changes to @Matt's answer:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import org.apache.log4j.Logger;
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class MyClass implements Callable {

    private static Logger logger = Logger.getLogger(MyClass.class);
    private String server;
    private Properties properties;

    public MyClass() {
    }

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        properties = loadProperties("application.properties");
        server = properties.getProperty("server.address");
        logger.info("Server is " + server);
        return null;
    }

    public Properties loadProperties(String name) {
        if (properties == null) {
            properties = new Properties();
            // Load the file from the classpath via the context class loader
            try (InputStream in = Thread.currentThread().getContextClassLoader().getResourceAsStream(name)) {
                properties.load(in);
            } catch (IOException e) {
                throw new RuntimeException("Could not load properties file: " + name, e);
            }
        }
        return properties;
    }
}
