If I am using RESTEasy, I can use the resteasy-jackson-provider, which handles marshalling my objects to JSON and back for my REST endpoints, e.g.:
@GET
@Path("/")
@Produces({MediaType.APPLICATION_JSON})
public MyThing getSingle() {
return new MyThing();
}
This is nice as it means I don't need to specify an encoder for each and every type - Jackson just deals with it.
I'm now learning websockets, and I find I have to provide encoders:
@ServerEndpoint(value = "/websocket", encoders = {MyThingEncoder.class}, decoders = {MyThingDecoder.class})
public class Websocket {
@OnOpen
public void onOpen(Session session) {
session.getBasicRemote().sendObject(new MyThing());
}
}
This is frustrating, as I don't want to have to provide an encoder/decoder for every separate entity type, especially if they are just using Jackson. Right now, my encoder/decoders look like:
public class MyThingEncoder implements Encoder.Text<MyThing> {
private static final ObjectMapper MAPPER = new ObjectMapper();
@Override
public void init(EndpointConfig endpointConfig) {
}
@Override
public void destroy() {
}
@Override
public String encode(MyThing t) throws EncodeException {
try {
return MAPPER.writeValueAsString(t);
} catch (IOException e) {
throw new EncodeException(t, "Could not encode.", e);
}
}
}
I tried using generics by changing the class definition to MyThingEncoder&lt;T&gt; (below), but it threw an exception (also below):
package com.jetnuts.serv.encoders;
import org.codehaus.jackson.map.ObjectMapper;
import org.jboss.resteasy.util.Encode;
import javax.websocket.*;
import java.io.IOException;
public class MyThingEncoder<T> implements Encoder.Text<T> {
private static final ObjectMapper MAPPER = new ObjectMapper();
Class<T> typeOf;
@Override
public void init(EndpointConfig endpointConfig) {
}
@Override
public void destroy() {
}
@Override
public String encode(T t) throws EncodeException {
try {
return MAPPER.writeValueAsString(t);
} catch (IOException e) {
throw new EncodeException(t, "Could not encode.", e);
}
}
}
The exception:
org.eclipse.jetty.websocket.api.InvalidWebSocketException: Invalid type declared for interface javax.websocket.Encoder$Text on class class com.jetnuts.serv.encoders.MyThingEncoder
at org.eclipse.jetty.websocket.jsr356.metadata.EncoderMetadataSet.getEncoderType(EncoderMetadataSet.java:82)
at org.eclipse.jetty.websocket.jsr356.metadata.EncoderMetadataSet.discover(EncoderMetadataSet.java:50)
at org.eclipse.jetty.websocket.jsr356.metadata.CoderMetadataSet.addAll(CoderMetadataSet.java:78)
at org.eclipse.jetty.websocket.jsr356.server.AnnotatedServerEndpointMetadata.<init>(AnnotatedServerEndpointMetadata.java:51)
at org.eclipse.jetty.websocket.jsr356.server.ServerContainer.getServerEndpointMetadata(ServerContainer.java:162)
at org.eclipse.jetty.websocket.jsr356.server.ServerContainer.addEndpoint(ServerContainer.java:87)
at org.eclipse.jetty.websocket.jsr356.server.ServerContainer.doStart(ServerContainer.java:139)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:106)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ScopedHandler.doStart(ScopedHandler.java:120)
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:803)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:344)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1379)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1341)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:772)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:261)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:517)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:41)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:188)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:499)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:147)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:458)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner$1.run(Scanner.java:329)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
How can I avoid needing an encoder/decoder which is doing the exact same thing for every type of object I want to send/receive?
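One approach, shown below, is to declare the decoder against Object so that a single Jackson-based class handles any text payload; the endpoint then receives plain Objects and maps them to a concrete type on demand.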
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import javax.websocket.DecodeException;
import javax.websocket.Decoder;
import javax.websocket.EndpointConfig;
import java.io.IOException;
import java.io.Reader;
public class DataDecoder implements Decoder.TextStream<Object> {
@Override
public Object decode(Reader reader) throws DecodeException, IOException {
ObjectMapper mapper = new ObjectMapper();
return mapper.readValue(reader, new TypeReference<Object>() {
});
}
@Override
public void init(EndpointConfig config) {
}
@Override
public void destroy() {
}
}
@ClientEndpoint(decoders = { DataDecoder.class })
public class SomeClass {
private WebSocketContainer container;
private Session session;
private Object objectModel;
@Override
public SomeType getAimedJobs() throws DatabaseException, ModuleNotLoadException, NamingException {
SomeType someType = new SomeType ();
ObjectMapper mapper = new ObjectMapper();
try {
String jsonInString = mapper.writeValueAsString(this.objectModel);
someType = mapper.readValue(jsonInString, SomeType.class);
} catch (IOException e) {
e.printStackTrace();
}
return someType ;
}
@OnOpen
public void onOpen(Session session) {
this.session = session;
}
@OnMessage
public void onMessage(Object o) {
this.objectModel = o;
}
@OnClose
public void onClose(CloseReason reason) {
System.out.println("WebSocket connection closed with CloseCode: " + reason.getCloseCode());
this.session.close();
}
}
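A matching encoder can be written the same way. The following is an untested sketch that mirrors the decoder above and simply delegates to Jackson; the class name DataEncoder is made up here:
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import javax.websocket.EncodeException;
import javax.websocket.Encoder;
import javax.websocket.EndpointConfig;
public class DataEncoder implements Encoder.Text<Object> {
    private static final ObjectMapper MAPPER = new ObjectMapper();
    @Override
    public String encode(Object object) throws EncodeException {
        try {
            // Let Jackson serialize whatever object the endpoint sends.
            return MAPPER.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            throw new EncodeException(object, "Could not encode.", e);
        }
    }
    @Override
    public void init(EndpointConfig config) {
    }
    @Override
    public void destroy() {
    }
}
It would be registered with encoders = {DataEncoder.class} on the endpoint. Whether a container accepts an Object-typed encoder for arbitrary payload classes can vary by implementation, so treat this as a starting point rather than a guaranteed fix.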
Can someone tell me how to add an attachment using ProducerTemplate?
I have been searching but I cannot find an answer for my case.
I am using Camel 2.1 and I have these three classes:
MailSender2.java
import java.util.HashMap;
import java.util.Map;
import java.util.ResourceBundle;
import org.apache.camel.Exchange;
import org.apache.camel.ExchangePattern;
import org.apache.camel.ProducerTemplate;
public class MailSender2 extends TypeMail{
private static final ResourceBundle RES = ResourceBundle.getBundle("mail");
protected static final String MAIL_NOTIFICATION_ENDPOINT=RES.getString("mail.host.location").trim()+":"+RES.getString("mail.port").trim();
private Map<String, Object> header;
public MailSender2() {
this.header=new HashMap<>();
}
public void send(ProducerTemplate template) {
this.header.put("From", this.getT_from());
this.header.put("To", this.getT_to());
this.header.put("Subject", this.getT_subject());
this.header.put(Exchange.CONTENT_TYPE, "text/html; charset=UTF-8");
//this.getF_ficher() <-- I have here the file to attach
//this.getT_ficnon() <-- I have here the name of the file
//this.getT_ficext() <-- I have here the extension of the file
template.sendBodyAndHeaders(MAIL_NOTIFICATION_ENDPOINT, this.getT_mensaje(), header);
}
}
TypeMail.java:
public class TypeMail {
private String t_id;
private String t_from;
private String t_to;
private String t_subject;
private String t_mensaje;
private byte[] f_ficher;
private String t_ficnon;
private String t_ficext;
public String getT_id() {
return t_id;
}
public void setT_id(String t_id) {
this.t_id = t_id;
}
public String getT_from() {
return t_from;
}
public void setT_from(String t_from) {
this.t_from = t_from;
}
public String getT_to() {
return t_to;
}
public void setT_to(String t_to) {
this.t_to = t_to;
}
public String getT_subject() {
return t_subject;
}
public void setT_subject(String t_subject) {
this.t_subject = t_subject;
}
public String getT_mensaje() {
return t_mensaje;
}
public void setT_mensaje(String t_mensaje) {
this.t_mensaje = t_mensaje;
}
public byte[] getF_ficher() {
return f_ficher;
}
public void setF_ficher(byte[] f_ficher) {
this.f_ficher = f_ficher;
}
public String getT_ficnon() {
return t_ficnon;
}
public void setT_ficnon(String t_ficnon) {
this.t_ficnon = t_ficnon;
}
public String getT_ficext() {
return t_ficext;
}
public void setT_ficext(String t_ficext) {
this.t_ficext = t_ficext;
}
}
MailCommunicationTransformer.java:
import org.apache.camel.CamelContext;
import org.apache.camel.Exchange;
import org.apache.camel.ProducerTemplate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ws.soap.client.SoapFaultClientException;
public class MailCommunicationTransformer {
MailSender2 mailSender = null;
static Logger logger = LoggerFactory.getLogger(MailCommunicationTransformer.class);
public MailCommunicationTransformer()
{
}
public MailLog transform(Object actualMessage, Exchange exchange, CamelContext context)
{
mailSender = exchange.getIn().getBody(MailSender2.class);
try {
MailSenderDAO mailSenderDAO = (MailSenderDAO)context.getRegistry().lookup("MailSenderDAO");
mailSenderDAO.validarInput(mailSender);
if (mailSender!=null) {
ProducerTemplate template=exchange.getContext().createProducerTemplate();
try {
mailSender.send(template);
}
catch (Throwable ex) {
ex.printStackTrace();
exchange.setProperty(Exchange.EXCEPTION_CAUGHT,ex);
}
}
}catch (MailException me) {
me.printStackTrace();
exchange.setProperty(Exchange.EXCEPTION_CAUGHT,me);
}
Throwable e = exchange.getProperty(Exchange.EXCEPTION_CAUGHT,
Throwable.class);
String response = "OK";
if (e != null) {
StringBuffer mensaje = new StringBuffer();
if (e instanceof SoapFaultClientException) {
mensaje.append("MAIL fault exception: CLIENT. ");
} else {
mensaje.append("MAIL fault exception: MAIL. ");
}
logger.info("MailCommunicationTransformer",e);
while (e != null) {
e.printStackTrace();
mensaje.append(e.getMessage());
e = e.getCause();
}
response = mensaje.toString();
}
MailLog log = new MailLog(mailSender, response); //, protocolo
return log;
}
}
In TypeMail I have the file in f_ficher, plus the file name (t_ficnon) and extension (t_ficext), but I cannot figure out how to attach this file in MailSender2 before template.sendBodyAndHeaders(.....)
Any help would be very much appreciated.
Regards.
Perhaps I don't fully understand your question, but the ProducerTemplate doesn't know anything about the message type.
You just send a body, and perhaps also headers, to an endpoint.
Therefore the body just needs to be a fully constructed MimeMessage object, as documented in the Camel Mail docs.
You can simply construct the mail message with Java and then use that object with the ProducerTemplate (which you already do):
template.sendBodyAndHeaders("your-smtp-endpoint", yourMimeMessageInstance, yourHeaderMap);
Thanks for the answer!
But in the end I got it working this way, with a new class, EmailProcessor.java:
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.nio.file.Files;
import java.util.Objects;
import java.util.ResourceBundle;
import javax.activation.DataHandler;
import org.apache.camel.Exchange;
import org.apache.camel.Message;
import org.apache.camel.Processor;
import org.apache.commons.codec.binary.Base64;
public class EmailProcessor implements Processor {
// Class attributes
private TypeMail typeMail;
public EmailProcessor(TypeMail typeMail) {
this.typeMail = typeMail;
}
@Override
public void process(Exchange exchange) throws Exception {
Message ms = exchange.getIn();
ms.setHeader("From", this.typeMail.getT_from());
ms.setHeader("To", this.typeMail.getT_to());
ms.setHeader("Subject", this.typeMail.getT_subject());
ms.setHeader(Exchange.CONTENT_TYPE, "text/html; charset=UTF-8");
ms.setBody("<p style='font-family: Calibri;'>" + this.typeMail.getT_mensaje() + "</p>");
if (this.typeMail.getF_ficher() != null) {
String mimeType = "application/pdf";
if ("zip".equals(typeMail.getT_ficext())) {
mimeType = "application/zip";
}
ms.addAttachment(typeMail.getT_ficnon() + "." + typeMail.getT_ficext(), new DataHandler(typeMail.getF_ficher(), mimeType));
}
}
}
MailSender.java:
import java.util.ResourceBundle;
import org.apache.camel.ExchangePattern;
import org.apache.camel.ProducerTemplate;
public class MailSender extends TypeMail{
private static final ResourceBundle RES = ResourceBundle.getBundle("mail");
protected static final String MAIL_NOTIFICATION_ENDPOINT=RES.getString("mail.host.location").trim()+":"+RES.getString("mail.port").trim();
public MailSender() {
}
public void send(ProducerTemplate template) {
template.send(MAIL_NOTIFICATION_ENDPOINT, ExchangePattern.InOnly, new EmailProcessor(this));
}
}
What am I doing wrong? My Kafka Streams program below gives this error while streaming the data: "Cannot deserialize instance of com.kafka.productiontest.models.TimeOff out of START_ARRAY token".
I have a topic timeOffs2 which contains time-off information, with key timeOffID and a value that is an object containing employeeId. I just want to group all time offs by the employee key and write them to the store.
For the store, the key will be employeeId and the value will be the list of time offs.
Program properties and streaming logic:
public Properties getKafkaProperties() throws UnknownHostException {
InetAddress myHost = InetAddress.getLocalHost();
Properties kafkaStreamProperties = new Properties();
kafkaStreamProperties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
kafkaStreamProperties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
kafkaStreamProperties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, TimeOffSerde.class);
kafkaStreamProperties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
kafkaStreamProperties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "com.kafka.productiontest.models.TimeOffSerializer");
kafkaStreamProperties.put(StreamsConfig.APPLICATION_ID_CONFIG, application_id );
kafkaStreamProperties.put(StreamsConfig.APPLICATION_SERVER_CONFIG, myHost.getHostName() + ":" + port);
return kafkaStreamProperties;
}
String topic = "timeOffs2";
StreamsBuilder builder = new StreamsBuilder();
KStream<String, TimeOff> source = builder.stream(topic);
KTable<String, ArrayList<TimeOff>> newStore = source.groupBy((k, v) -> v.getEmployeeId())
.aggregate(ArrayList::new,
(key, value, aggregate) -> {
aggregate.add(value);
return aggregate;
}, Materialized.as("NewStore").withValueSerde(TimeOffListSerde(TimeOffSerde)));
final Topology topology = builder.build();
final KafkaStreams streams = new KafkaStreams(topology, getKafkaProperties());
TimeOffSerializer.java
package com.kafka.productiontest.models;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;
import java.util.Map;
public class TimeOffSerializer implements Serializer {
@Override
public void configure(Map configs, boolean isKey) {
}
@Override
public byte[] serialize(String topic, Object data) {
byte[] retVal = null;
ObjectMapper objectMapper = new ObjectMapper();
try {
retVal = objectMapper.writeValueAsString(data).getBytes();
} catch (Exception e) {
e.printStackTrace();
}
return retVal;
}
@Override
public void close() {
}
}
TimeOffDeserializer.java
package com.kafka.productiontest.models;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer ;
import java.util.Map;
public class TimeOffDeserializer implements Deserializer {
@Override
public void configure(Map configs, boolean isKey) {
}
@Override
public TimeOff deserialize(String arg0, byte[] arg1) {
ObjectMapper mapper = new ObjectMapper();
TimeOff timeOff = null;
try {
timeOff = mapper.readValue(arg1, TimeOff.class);
} catch (Exception e) {
e.printStackTrace();
}
return timeOff;
}
@Override
public void close() {
}
}
TimeOffSerde.java
package com.kafka.productiontest.models;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;
import java.util.Map;
public class TimeOffSerde implements Serde<Object> {
private final Serde inner;
public TimeOffSerde(){
inner = Serdes.serdeFrom(new TimeOffSerializer(), new TimeOffDeserializer());
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
inner.serializer().configure(configs, isKey);
inner.deserializer().configure(configs, isKey);
}
@Override
public void close() {
inner.serializer().close();
inner.deserializer().close();
}
@Override
public Serializer<Object> serializer() {
return inner.serializer();
}
@Override
public Deserializer<Object> deserializer() {
return inner.deserializer();
}
}
TimeOffListSerializer.java
package com.kafka.productiontest.models;
import org.apache.kafka.common.serialization.Serializer;
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.sql.Time;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.Map;
public class TimeOffListSerializer implements Serializer<ArrayList<TimeOff>> {
private Serializer<TimeOff> inner;
public TimeOffListSerializer(Serializer<TimeOff> inner) {
this.inner = inner;
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
}
@Override
public byte[] serialize(String topic, ArrayList<TimeOff> data) {
final int size = data.size();
final ByteArrayOutputStream baos = new ByteArrayOutputStream();
final DataOutputStream dos = new DataOutputStream(baos);
final Iterator<TimeOff> iterator = data.iterator();
try {
dos.writeInt(size);
while (iterator.hasNext()) {
final byte[] bytes = inner.serialize(topic, iterator.next());
dos.writeInt(bytes.length);
dos.write(bytes);
}
}catch (Exception ex) {
}
return baos.toByteArray();
}
@Override
public void close() {
inner.close();
}
}
TimeOffListDeserializer.java
package com.kafka.productiontest.models;
import org.apache.kafka.common.serialization.Deserializer;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Map;
public class TimeOffListDeserializer implements Deserializer<ArrayList<TimeOff>> {
private final Deserializer<TimeOff> valueDeserializer;
public TimeOffListDeserializer(final Deserializer<TimeOff> valueDeserializer) {
this.valueDeserializer = valueDeserializer;
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
}
@Override
public ArrayList<TimeOff> deserialize(String topic, byte[] data) {
if (data == null || data.length == 0) {
return null;
}
final ArrayList<TimeOff> arrayList = new ArrayList<>();
final DataInputStream dataInputStream = new DataInputStream(new ByteArrayInputStream(data));
try {
final int records = dataInputStream.readInt();
for (int i = 0; i < records; i++) {
final byte[] valueBytes = new byte[dataInputStream.readInt()];
dataInputStream.read(valueBytes);
arrayList.add(valueDeserializer.deserialize(topic, valueBytes));
}
} catch (IOException e) {
throw new RuntimeException("Unable to deserialize ArrayList", e);
}
return arrayList;
}
@Override
public void close() {
}
}
TimeOffListSerde.java
package com.kafka.productiontest.models;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;
import java.util.ArrayList;
import java.util.Map;
public class TimeOffListSerde implements Serde<ArrayList<TimeOff>> {
private Serde<ArrayList<TimeOff>> inner;
public TimeOffListSerde() {
}
public TimeOffListSerde(Serde<TimeOff> serde){
inner = Serdes.serdeFrom(new TimeOffListSerializer(serde.serializer()), new TimeOffListDeserializer(serde.deserializer()));
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
inner.serializer().configure(configs, isKey);
inner.deserializer().configure(configs, isKey);
}
@Override
public void close() {
inner.serializer().close();
inner.deserializer().close();
}
@Override
public Serializer<ArrayList<TimeOff>> serializer() {
return inner.serializer();
}
@Override
public Deserializer<ArrayList<TimeOff>> deserializer() {
return inner.deserializer();
}
}
I think the issue is in this part, with withValueSerde. The code does not compile as written, but if I remove withValueSerde it gives me the error "Can not deserialize TimeOff object". Can you please help and guide me on what I am doing wrong?
KTable<String, ArrayList<TimeOff>> newStore = source.groupBy((k, v) -> v.getEmployeeId())
.aggregate(ArrayList::new,
(key, value, aggregate) -> {
aggregate.add(value);
return aggregate;
}, Materialized.as("NewStore").withValueSerde(TimeOffListSerde(TimeOffSerde)));
Looking at your code I can see several issues:
TimeOffSerde - it should implement Serde<TimeOff>, not Serde<Object>
You don't pass the key and value types to Materialized, so it assumes they are Object
So your streaming part should be something like:
KTable<String, ArrayList<TimeOff>> newStore = source.groupBy((k, v) -> v.getEmployeeId())
.aggregate(ArrayList::new,
(key, value, aggregate) -> {
aggregate.add(value);
return aggregate;
}, Materialized.<String, ArrayList<TimeOff>, KeyValueStore<Bytes, byte[]>>as("NewStore").withValueSerde(new TimeOffListSerde(new TimeOffSerde())));
NOTE: Remember to clear the state store directory after the modification.
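For the first point, the narrowed serde could look roughly like the sketch below; it assumes TimeOffSerializer and TimeOffDeserializer are also declared as Serializer<TimeOff> and Deserializer<TimeOff>:
package com.kafka.productiontest.models;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;
import java.util.Map;
public class TimeOffSerde implements Serde<TimeOff> {
    // Delegate to the existing serializer/deserializer, now with the concrete type.
    private final Serde<TimeOff> inner =
            Serdes.serdeFrom(new TimeOffSerializer(), new TimeOffDeserializer());
    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        inner.serializer().configure(configs, isKey);
        inner.deserializer().configure(configs, isKey);
    }
    @Override
    public void close() {
        inner.close();
    }
    @Override
    public Serializer<TimeOff> serializer() {
        return inner.serializer();
    }
    @Override
    public Deserializer<TimeOff> deserializer() {
        return inner.deserializer();
    }
}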
How to serialize an object with Jackson if one of getters is throwing an exception?
Example:
public class Example {
public String getSomeField() {
//some logic which will throw in example NPE
throw new NullPointerException();
}
}
Ideally I would like to get JSON:
{"someField":"null"}
or
{"someField":"NPE"}
Probably the most generic way would be implementing a custom BeanPropertyWriter. You can register it by creating a BeanSerializerModifier class. The example below shows how to do that.
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationConfig;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.BeanPropertyWriter;
import com.fasterxml.jackson.databind.ser.BeanSerializerModifier;
import java.util.List;
import java.util.stream.Collectors;
public class JsonApp {
public static void main(String[] args) throws Exception {
SimpleModule module = new SimpleModule();
module.setSerializerModifier(new BeanSerializerModifier() {
@Override
public List<BeanPropertyWriter> changeProperties(SerializationConfig config, BeanDescription beanDesc, List<BeanPropertyWriter> beanProperties) {
if (beanDesc.getBeanClass() == Response.class) {
return beanProperties.stream()
.map(SilentExceptionBeanPropertyWriter::new)
.collect(Collectors.toList());
}
return super.changeProperties(config, beanDesc, beanProperties);
}
});
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(module);
System.out.println(mapper.writeValueAsString(new Response(1, "ONE")));
System.out.println(mapper.writeValueAsString(new Response(-1, "MINUS")));
System.out.println(mapper.writeValueAsString(new Response(-1, null)));
}
}
class SilentExceptionBeanPropertyWriter extends BeanPropertyWriter {
public SilentExceptionBeanPropertyWriter(BeanPropertyWriter base) {
super(base);
}
@Override
public void serializeAsField(Object bean, JsonGenerator gen, SerializerProvider prov) throws Exception {
try {
super.serializeAsField(bean, gen, prov);
} catch (Exception e) {
Throwable cause = e.getCause();
gen.writeFieldName(_name);
gen.writeString(cause.getClass().getName() + ":" + cause.getMessage());
}
}
}
class Response {
private int count;
private String message;
public Response(int count, String message) {
this.count = count;
this.message = message;
}
public int getCount() {
if (count < 0) {
throw new IllegalStateException("Count is less than ZERO!");
}
return count;
}
public void setCount(int count) {
this.count = count;
}
public String getMessage() {
if (message == null) {
throw new NullPointerException("message can not be null!");
}
return message;
}
public void setMessage(String message) {
this.message = message;
}
}
Above example prints:
{"count":1,"message":"ONE"}
{"count":"java.lang.IllegalStateException:Count is less than ZERO!","message":"MINUS"}
{"count":"java.lang.IllegalStateException:Count is less than ZERO!","message":"java.lang.NullPointerException:message can not be null!"}
I have a controller GGSNAcceptController.java:
package com.viettel.pcrf.controller;
import com.viettel.fw.Exception.LogicException;
import com.viettel.fw.dto.BaseMessage;
import com.viettel.fw.web.controller.BaseController;
import com.viettel.pcrf.common.Const;
import com.viettel.pcrf.dto.GgsnAcceptDTO;
import com.viettel.pcrf.webconfig.service.GgsnAcceptService;
import java.io.Serializable;
import java.util.List;
import javax.annotation.PostConstruct;
import javax.faces.bean.ManagedBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;
@Component
@ManagedBean(name = "ggsnAcceptController")
@Scope("view")
public class GGSNAcceptController extends BaseController implements Serializable, BaseTableCTRL {
/**
* VARIABLES & GETTER/SETTER
*/
private GgsnAcceptDTO ggsnAccept;
private GgsnAcceptDTO selectedGgsnAccept;
private List<GgsnAcceptDTO> listGgsnAccept;
public GgsnAcceptDTO getGgsnAccept() {
return ggsnAccept;
}
public void setGgsnAccept(GgsnAcceptDTO ggsnAccept) {
this.ggsnAccept = ggsnAccept;
}
public GgsnAcceptDTO getSelectedGgsnAccept() {
return selectedGgsnAccept;
}
public void setSelectedGgsnAccept(GgsnAcceptDTO selectedGgsnAccept) {
this.selectedGgsnAccept = selectedGgsnAccept;
}
public List<GgsnAcceptDTO> getListGgsnAccept() {
return listGgsnAccept;
}
public void setListGgsnAccept(List<GgsnAcceptDTO> listGgsnAccept) {
this.listGgsnAccept = listGgsnAccept;
}
/**
* SERVICE
*
*/
@Autowired
private GgsnAcceptService ggsnAcceptServ;
/**
* INIT
*
*/
@PostConstruct
@Override
public void init() {
updateCtrl();
}
@Override
public void updateCtrl() {
clear();
System.out.println(ggsnAcceptServ == null);
listGgsnAccept = ggsnAcceptServ.findAll();
}
private String ggsnAcceptSelected;
@Override
public void updateDB() {
try {
if (ggsnAccept == null) {
throw new LogicException("nullSelected", "GGSN Config is not yet selected!");
}
if (formStatus == Const.BTN_ADD && ggsnAcceptServ.isExisted(ggsnAccept)) {
throw new LogicException("insertExisted", "GGSN Config existed!");
}
// if (systemCfgSelected != null && systemCfgSelected.equals(systemCfg.getSystemCfgName()) && langServ.isExisted(systemCfg)) {
// throw new LogicException("updateExisted", "GGSN Config is existed!");
// }
BaseMessage msg = ggsnAcceptServ.updateGgsn(ggsnAccept);
if (msg.isSuccess()) {
reportSuccess("msgInfo", "Success");
}
updateCtrl();
selectedGgsnAccept = (GgsnAcceptDTO) msg.getOutputObject();
} catch (LogicException ex) {
reportError("msgInfo", ex.getDescription());
} catch (Exception ex) {
logger.error(ex, ex);
}
}
@Override
public void deleteDB() {
try {
if (ggsnAccept == null) {
throw new LogicException("nullSelected", "GGSN Config is not selected yet!");
}
BaseMessage msg = ggsnAcceptServ.deleteGgsn(ggsnAccept);
if (msg.isSuccess()) {
reportSuccess("msgInfo", "msg.delete.success");
}
updateCtrl();
} catch (LogicException ex) {
reportError("msgInfo", ex.getDescription());
} catch (Exception ex) {
logger.error(ex, ex);
}
}
@Override
public void prepareAdd() {
ggsnAccept = new GgsnAcceptDTO();
selectedGgsnAccept = null;
}
@Override
public void prepareEdit() {
if (selectedGgsnAccept != null) {
ggsnAccept = selectedGgsnAccept;
}
}
@Override
public void prepareDelete() {
if (selectedGgsnAccept != null) {
ggsnAccept = selectedGgsnAccept;
}
}
@Override
public void clear() {
selectedGgsnAccept = null;
ggsnAccept = null;
}
@Override
public void onRowChangeListener() {
throw new UnsupportedOperationException("Not supported yet."); //To change body of generated methods, choose Tools | Templates.
}
}
An interface GgsnAcceptService.java:
package com.viettel.pcrf.webconfig.service;
import com.viettel.fw.dto.BaseMessage;
import com.viettel.pcrf.dto.GgsnAcceptDTO;
import java.util.List;
public interface GgsnAcceptService {
public List<GgsnAcceptDTO> findAll();
public List<GgsnAcceptDTO> findAll(List filters);
public BaseMessage updateGgsn(GgsnAcceptDTO ggsn) throws Exception;
public BaseMessage deleteGgsn(GgsnAcceptDTO ggsn) throws Exception;
public boolean isExisted(GgsnAcceptDTO ggsn) throws Exception;
}
And a class implement above interface:
package com.viettel.pcrf.webconfig.service;
import com.viettel.fw.common.util.extjs.FilterRequest;
import com.viettel.fw.dto.BaseMessage;
import com.viettel.pcrf.webconfig.repo.GgsnAcceptRepository;
import com.viettel.pcrf.common.util.mapper.GgsnAcceptMapper;
import com.viettel.pcrf.dto.GgsnAcceptDTO;
import java.util.List;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.viettel.service.BaseServiceImpl;
import java.util.ArrayList;
@Service
public class GgsnAcceptServiceImpl extends BaseServiceImpl implements GgsnAcceptService {
private GgsnAcceptMapper mapper = new GgsnAcceptMapper();
@Autowired
private GgsnAcceptRepository repository;
public Logger logger = Logger.getLogger(GgsnAcceptService.class);
@Override
public List<GgsnAcceptDTO> findAll(List filters) {
return mapper.toDtoBean(repository.findAll(repository.toPredicate(filters)));
}
@Override
public List<GgsnAcceptDTO> findAll() {
return mapper.toDtoBean(repository.findAll());
}
@Override
public BaseMessage updateGgsn(GgsnAcceptDTO ggsn) throws Exception {
BaseMessage msg = new BaseMessage();
GgsnAcceptDTO newGgsn = mapper.toDtoBean(repository.saveAndFlush(mapper.toPersistenceBean(ggsn)));
msg.setOutputObject(newGgsn);
msg.setSuccess(true);
return msg;
}
@Override
public boolean isExisted(GgsnAcceptDTO ggsn) throws Exception {
List<FilterRequest> listReq = new ArrayList<>();
listReq.add(new FilterRequest("IP", ggsn.getIp()));
return repository.findOne(repository.toPredicate(listReq)) != null;
}
@Override
public BaseMessage deleteGgsn(GgsnAcceptDTO ggsn) throws Exception {
BaseMessage msg = new BaseMessage();
repository.delete(mapper.toPersistenceBean(ggsn));
msg.setSuccess(true);
return msg;
}
}
I get a null pointer error when trying to access a page that calls the controller. Is there anything wrong with my code?
My property ggsnAcceptServ is always null, although I have annotated it with @Autowired. I'm new to Spring; please help me understand why this property is null. Any help would be great.
You have a problem mixing JSF and Spring:
@Component
@ManagedBean(name = "ggsnAcceptController")
@Scope("view")
Your controller is instantiated in the JSF context, not in the Spring context. That's why the autowiring is not working.
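One common way to bridge the two, sketched below under stated assumptions: keep the bean purely JSF-managed, register org.springframework.web.jsf.el.SpringBeanFacesELResolver as an el-resolver in faces-config.xml, and inject the Spring service through @ManagedProperty. The bean name ggsnAcceptServiceImpl is only the default Spring name for GgsnAcceptServiceImpl and is an assumption here:
import java.io.Serializable;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ManagedProperty;
import javax.faces.bean.ViewScoped;
@ManagedBean(name = "ggsnAcceptController")
@ViewScoped
public class GGSNAcceptController implements Serializable {
    // Resolved through the Spring EL resolver; JSF injects it via the setter,
    // so the setter must be present.
    @ManagedProperty(value = "#{ggsnAcceptServiceImpl}")
    private GgsnAcceptService ggsnAcceptServ;
    public void setGgsnAcceptServ(GgsnAcceptService ggsnAcceptServ) {
        this.ggsnAcceptServ = ggsnAcceptServ;
    }
    // ... rest of the controller unchanged ...
}
Either way, the point above stands: the instance JSF creates is not wired by Spring, so the service has to be resolved through EL (or the bean has to be managed by Spring end to end).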
We're generating an XML document with Java (org.w3c.dom.Node), essentially using
parent.appendChild(doc.createElement(nodeName));
This generates an XML document whose nodes appear in the order in which appendChild was called. The final XML, however, needs to adhere to a given XSD. Our code can ensure that value types, mandatory fields, etc. are OK; I am, however, struggling with the node order.
Is there any approach to either:
On insert ensure that the node order matches the XSD
Re-order the entire XML after creation according to the XSD
To clarify:
What we have is:
<myNodeA>...</myNodeA>
<myNodeC>...</myNodeC>
<myNodeB>...</myNodeB>
And what the XSD wants is:
<myNodeA>...</myNodeA>
<myNodeB>...</myNodeB>
<myNodeC>...</myNodeC>
Thanks, Simon
I've done this before by traversing the schema and then pulling relevant pieces from the XML model and streaming it along the way.
There are multiple xsd model libraries to use:
xsom
xerces
xmlschema
Here's an example using xsom (which can be replaced by one of the above) and xom (which can be replaced with dom)
Main:
package main;
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;
import node.xom.WrappedDocument;
import nu.xom.Builder;
import nu.xom.Document;
import nu.xom.Element;
import reorder.xsom.UncheckedXMLStreamWriter;
import reorder.xsom.XSVisitorWriteOrdered;
import com.sun.xml.xsom.XSElementDecl;
import com.sun.xml.xsom.XSSchemaSet;
import com.sun.xml.xsom.parser.XSOMParser;
public class ReorderXmlToXsd {
public static void main(String[] args) throws Exception {
File unorderedXml = new File("unordered.xml");
File xsd = new File("your.xsd");
File orderedXml = new File("ordered.xml");
XSOMParser p = new XSOMParser();
p.parse(xsd);
XSSchemaSet parsed = p.getResult();
Builder xom = new Builder();
Document unorderedDoc = xom.build(unorderedXml);
Element unorderedRoot = unorderedDoc.getRootElement();
XSElementDecl root = parsed.getElementDecl(
unorderedRoot.getNamespaceURI(),
unorderedRoot.getLocalName());
XMLOutputFactory stax = XMLOutputFactory.newInstance();
try (OutputStream to = new FileOutputStream(orderedXml)) {
XMLStreamWriter using = stax.createXMLStreamWriter(to, "UTF-8");
root.visit(
new XSVisitorWriteOrdered(
new WrappedDocument(unorderedDoc),
new UncheckedXMLStreamWriter(using)));
}
}
}
The actual reordering logic. You will probably have to modify this further. For example, I didn't have to deal with the xsd:any for my project.
package reorder.xsom;
import node.WrappedNode;
import com.sun.xml.xsom.*;
import com.sun.xml.xsom.visitor.XSVisitor;
public class XSVisitorWriteOrdered implements XSVisitor {
private final WrappedNode currNode;
private final UncheckedXMLStreamWriter writeTo;
public XSVisitorWriteOrdered(WrappedNode currNode, UncheckedXMLStreamWriter writeTo) {
this.currNode = currNode;
this.writeTo = writeTo;
}
@Override
public void attributeUse(XSAttributeUse use) {
attributeDecl(use.getDecl());
}
@Override
public void modelGroupDecl(XSModelGroupDecl decl) {
modelGroup(decl.getModelGroup());
}
@Override
public void modelGroup(XSModelGroup model) {
for (XSParticle term : model.getChildren()) {
term.visit(this);
}
}
@Override
public void particle(XSParticle particle) {
XSTerm term = particle.getTerm();
term.visit(this);
}
@Override
public void complexType(XSComplexType complex) {
for (XSAttributeUse use : complex.getAttributeUses()) {
attributeUse(use);
}
XSContentType contentType = complex.getContentType();
contentType.visit(this);
}
@Override
public void elementDecl(XSElementDecl decl) {
String namespaceUri = decl.getTargetNamespace();
String localName = decl.getName();
for (WrappedNode child : currNode.getChildElements(namespaceUri, localName)) {
writeTo.writeStartElement(namespaceUri, localName);
XSType type = decl.getType();
type.visit(new XSVisitorWriteOrdered(child, writeTo));
writeTo.writeEndElement();
}
}
@Override
public void attributeDecl(XSAttributeDecl decl) {
String namespaceUri = decl.getTargetNamespace();
String localName = decl.getName();
WrappedNode attribute = currNode.getAttribute(namespaceUri, localName);
if (attribute != null) {
String value = attribute.getValue();
if (value != null) {
writeTo.writeAttribute(namespaceUri, localName, value);
}
}
}
@Override
public void simpleType(XSSimpleType simpleType) {
String value = currNode.getValue();
if (value != null) {
writeTo.writeCharacters(value);
}
}
@Override
public void empty(XSContentType empty) {}
@Override
public void facet(XSFacet facet) {}
@Override
public void annotation(XSAnnotation ann) {}
@Override
public void schema(XSSchema schema) {}
@Override
public void notation(XSNotation notation) {}
@Override
public void identityConstraint(XSIdentityConstraint decl) {}
@Override
public void xpath(XSXPath xp) {}
@Override
public void wildcard(XSWildcard wc) {}
@Override
public void attGroupDecl(XSAttGroupDecl decl) {}
}
Stax writer:
package reorder.xsom;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamWriter;
public class UncheckedXMLStreamWriter {
private final XMLStreamWriter real;
public UncheckedXMLStreamWriter(XMLStreamWriter delegate) {
this.real = delegate;
}
public void writeStartElement(String namespaceUri, String localName) {
try {
real.writeStartElement(namespaceUri, localName);
} catch (XMLStreamException e) {
throw new RuntimeException(e);
}
}
public void writeEndElement() {
try {
real.writeEndElement();
} catch (XMLStreamException e) {
throw new RuntimeException(e);
}
}
public void writeAttribute(String namespaceUri, String localName, String value) {
try {
real.writeAttribute(namespaceUri, localName, value);
} catch (XMLStreamException e) {
throw new RuntimeException(e);
}
}
public void writeCharacters(String value) {
try {
real.writeCharacters(value);
} catch (XMLStreamException e) {
throw new RuntimeException(e);
}
}
}
The xml view:
package node;
import java.util.List;
import javax.annotation.Nullable;
public interface WrappedNode {
List<? extends WrappedNode> getChildElements(String namespaceUri, String localName);
@Nullable
WrappedNode getAttribute(String namespaceUri, String localName);
@Nullable
String getValue();
}
XOM implementation:
Document:
package node.xom;
import java.util.Collections;
import java.util.List;
import node.WrappedNode;
import nu.xom.Document;
import nu.xom.Element;
public class WrappedDocument implements WrappedNode {
private final Document d;
public WrappedDocument(Document d) {
this.d = d;
}
@Override
public List<WrappedElement> getChildElements(String namespaceUri, String localName) {
Element root = d.getRootElement();
if (isAt(root, namespaceUri, localName)) {
return Collections.singletonList(new WrappedElement(root));
}
return Collections.emptyList();
}
@Override
public WrappedAttribute getAttribute(String namespaceUri, String localName) {
throw new UnsupportedOperationException();
}
@Override
public String getValue() {
throw new UnsupportedOperationException();
}
@Override
public String toString() {
return d.toString();
}
private static boolean isAt(Element e, String namespaceUri, String localName) {
return namespaceUri.equals(e.getNamespaceURI())
&& localName.equals(e.getLocalName());
}
}
Element:
package node.xom;
import java.util.AbstractList;
import java.util.List;
import node.WrappedNode;
import nu.xom.Attribute;
import nu.xom.Element;
import nu.xom.Elements;
class WrappedElement implements WrappedNode {
private final Element e;
WrappedElement(Element e) {
this.e = e;
}
@Override
public List<WrappedElement> getChildElements(String namespaceUri, String localName) {
return asList(e.getChildElements(localName, namespaceUri));
}
@Override
public WrappedAttribute getAttribute(String namespaceUri, String localName) {
Attribute attribute = e.getAttribute(localName, namespaceUri);
return (attribute != null) ? new WrappedAttribute(attribute) : null;
}
@Override
public String getValue() {
return e.getValue();
}
@Override
public String toString() {
return e.toString();
}
private static List<WrappedElement> asList(final Elements eles) {
return new AbstractList<WrappedElement>() {
@Override
public WrappedElement get(int index) {
return new WrappedElement(eles.get(index));
}
@Override
public int size() {
return eles.size();
}
};
}
}
Attribute:
package node.xom;
import java.util.List;
import node.WrappedNode;
import nu.xom.Attribute;
class WrappedAttribute implements WrappedNode {
private final Attribute a;
WrappedAttribute(Attribute a) {
this.a = a;
}
@Override
public List<WrappedNode> getChildElements(String namespaceUri, String localName) {
throw new UnsupportedOperationException();
}
@Override
public WrappedNode getAttribute(String namespaceUri, String localName) {
throw new UnsupportedOperationException();
}
@Override
public String getValue() {
return a.getValue();
}
@Override
public String toString() {
return a.toString();
}
}
I don't think something like that exists. Simple reordering might be possible, but XML Schema allows you to define so many constraints that it might be impossible to write a general tool that converts an arbitrary XML document into a valid document according to some schema.
So, just build the document correctly in the first place.