I've written a bean class containing a HashMultimap (from the Guava library). I would like to XML-encode the bean using the JRE's XMLEncoder. Using a custom PersistenceDelegate I've successfully written the bean to a file. However, when I attempt to deserialize the XML I get the exception:
java.lang.NoSuchMethodException: <unbound>=HashMultimap.put("pz1", "pz2")
What am I doing wrong?
// create the bean
SomeBean sb = new SomeBean();
// add some data
HashMultimap<String, String> stateMap = HashMultimap.create();
stateMap.put("pz1", "pz2");
stateMap.put("pz3", "pz4");
sb.setStateMap(stateMap);
// encode as xml
FileOutputStream os = new FileOutputStream("myXMLFile.xml");
XMLEncoder encoder = new XMLEncoder(os);
encoder.setPersistenceDelegate(HashMultimap.class, new CustomPersistenceDelegate());
encoder.writeObject(sb);
encoder.close();
// decode the xml
FileInputStream is = new FileInputStream("myXMLFile.xml");
XMLDecoder decoder = new XMLDecoder(is);
Object deSerializedObject = decoder.readObject();
decoder.close();
class CustomPersistenceDelegate extends DefaultPersistenceDelegate
{
protected Expression instantiate(Object oldInstance, Encoder out)
{
return new Expression(oldInstance, oldInstance.getClass(), "create", null);
}
protected void initialize(Class<?> type, Object oldInstance, Object newInstance,
Encoder out)
{
super.initialize(type, oldInstance, newInstance, out);
com.google.common.collect.HashMultimap<String, String> m =
(com.google.common.collect.HashMultimap) oldInstance;
for (Map.Entry<String, String> entry : m.entries())
{
out.writeStatement(new Statement(oldInstance, "put",
new Object[] { entry.getKey(), entry.getValue() }));
}
}
}
public class SomeBean
{
private HashMultimap<String, String> stateMap;
public HashMultimap<String, String> getStateMap()
{
return stateMap;
}
public void setStateMap(HashMultimap<String, String> stateMap)
{
this.stateMap = stateMap;
}
}
I don't have a solution (yet), but here is something which at least clarifies the problem. It seems that a change made in Java 7 build 15 and higher has broken the method lookup that your Statement requires. If you add an ExceptionListener to the XMLEncoder, it gives you a better idea of how this is failing:
encoder.setExceptionListener(new ExceptionListener() {
@Override
public void exceptionThrown(Exception e) {
System.out.println("got exception. e=" + e);
e.printStackTrace();
}
});
You will see a full stacktrace then:
java.lang.Exception: Encoder: discarding statement HashMultimap.put(Object, Object);
at java.beans.Encoder.writeStatement(Encoder.java:306)
at java.beans.XMLEncoder.writeStatement(XMLEncoder.java:400)
at test2.XmlEncoderTest$CustomPersistenceDelegate.initialize(XmlEncoderTest.java:83)
at java.beans.PersistenceDelegate.writeObject(PersistenceDelegate.java:118)
at java.beans.Encoder.writeObject(Encoder.java:74)
at java.beans.XMLEncoder.writeObject(XMLEncoder.java:327)
at java.beans.Encoder.writeExpression(Encoder.java:330)
at java.beans.XMLEncoder.writeExpression(XMLEncoder.java:454)
at java.beans.PersistenceDelegate.writeObject(PersistenceDelegate.java:115)
at java.beans.Encoder.writeObject(Encoder.java:74)
at java.beans.XMLEncoder.writeObject(XMLEncoder.java:327)
at java.beans.Encoder.writeExpression(Encoder.java:330)
at java.beans.XMLEncoder.writeExpression(XMLEncoder.java:454)
at java.beans.DefaultPersistenceDelegate.doProperty(DefaultPersistenceDelegate.java:194)
at java.beans.DefaultPersistenceDelegate.initBean(DefaultPersistenceDelegate.java:253)
at java.beans.DefaultPersistenceDelegate.initialize(DefaultPersistenceDelegate.java:400)
at java.beans.PersistenceDelegate.writeObject(PersistenceDelegate.java:118)
at java.beans.Encoder.writeObject(Encoder.java:74)
at java.beans.XMLEncoder.writeObject(XMLEncoder.java:327)
at java.beans.Encoder.writeExpression(Encoder.java:330)
at java.beans.XMLEncoder.writeExpression(XMLEncoder.java:454)
at java.beans.PersistenceDelegate.writeObject(PersistenceDelegate.java:115)
at java.beans.Encoder.writeObject(Encoder.java:74)
at java.beans.XMLEncoder.writeObject(XMLEncoder.java:327)
at java.beans.Encoder.writeObject1(Encoder.java:258)
at java.beans.Encoder.cloneStatement(Encoder.java:271)
at java.beans.Encoder.writeStatement(Encoder.java:301)
at java.beans.XMLEncoder.writeStatement(XMLEncoder.java:400)
at java.beans.XMLEncoder.writeObject(XMLEncoder.java:330)
...
Caused by: java.lang.NoSuchMethodException: HashMultimap.put(Object, Object);
at java.beans.Statement.invokeInternal(Statement.java:313)
at java.beans.Statement.access$000(Statement.java:58)
at java.beans.Statement$2.run(Statement.java:185)
at java.security.AccessController.doPrivileged(Native Method)
at java.beans.Statement.invoke(Statement.java:182)
at java.beans.Statement.execute(Statement.java:173)
at java.beans.Encoder.writeStatement(Encoder.java:304)
... 51 more
The Caused by section shows that it failed to locate the put method. It looks to me like this happens because it can't match the method signature properly any more. It fails in the JDK's internal MethodFinder, but since that source code is not included in the JDK, I couldn't track it down well enough.
If I can find the exact cause, I will update this. Just wanted to provide you with more information in the meantime.
UPDATE
I think it's a bug in these later versions. Here is a unit test which exposes the bug (or unexpected behavior) more directly. The failure below is exactly what is happening in your code:
@Test
public void testMethodFinder() throws Exception {
Method m0 = MethodFinder.findMethod(this.getClass(), "setUp", new Class<?>[0]);
assertNotNull(m0);
// this is okay, because method is declared in the type referenced
Method m = MethodFinder.findMethod(Multimap.class, "put", new Class<?>[] { Object.class, Object.class });
assertNotNull(m);
try {
// this fails, apparently because method is not declared in this subclass (is inherited from parent class)
Method m2 = MethodFinder.findMethod(HashMultimap.class, "put", new Class<?>[] { Object.class, Object.class });
assertNotNull(m2);
} catch (Exception e) {
System.out.println("got exception. e=" + e);
}
}
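A possible workaround, since the lookup only seems to fail for methods that HashMultimap inherits rather than declares: don't emit put statements against the multimap at all, and instead rebuild it from a plain HashMap of lists through a public static helper of your own. This is an untested sketch; MultimapCodec, fromMap and toMap are my own illustrative names, not part of Guava:
import java.beans.DefaultPersistenceDelegate;
import java.beans.Encoder;
import java.beans.Expression;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;
import com.google.common.collect.HashMultimap;
public final class MultimapCodec {
    // Rebuilds a HashMultimap from a plain HashMap of ArrayLists. Both HashMap and
    // ArrayList are handled by XMLEncoder's built-in delegates, so the decoder never
    // has to resolve HashMultimap.put(Object, Object).
    public static HashMultimap<String, String> fromMap(HashMap<String, ArrayList<String>> map) {
        HashMultimap<String, String> result = HashMultimap.create();
        for (Map.Entry<String, ArrayList<String>> entry : map.entrySet()) {
            result.putAll(entry.getKey(), entry.getValue());
        }
        return result;
    }
    // Flattens a HashMultimap into a HashMap of ArrayLists for encoding.
    public static HashMap<String, ArrayList<String>> toMap(HashMultimap<String, String> multimap) {
        HashMap<String, ArrayList<String>> map = new HashMap<String, ArrayList<String>>();
        for (String key : multimap.keySet()) {
            map.put(key, new ArrayList<String>(multimap.get(key)));
        }
        return map;
    }
}
class CustomPersistenceDelegate extends DefaultPersistenceDelegate {
    @Override
    protected Expression instantiate(Object oldInstance, Encoder out) {
        @SuppressWarnings("unchecked")
        HashMultimap<String, String> multimap = (HashMultimap<String, String>) oldInstance;
        // Express the multimap as MultimapCodec.fromMap(toMap(multimap)); no initialize()
        // override is needed because the entries travel inside the HashMap argument.
        return new Expression(oldInstance, MultimapCodec.class, "fromMap",
                new Object[] { MultimapCodec.toMap(multimap) });
    }
}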
Related
I've only seen one thread containing information about the topic I've mentioned, which is:
How to Deserialising Kafka AVRO messages using Apache Beam
However, after trying a few variations of Kafka serializers I still cannot deserialize the Kafka messages. Here's my code:
public class Readkafka {
private static final Logger LOG = LoggerFactory.getLogger(Readkafka.class);
public static void main(String[] args) throws IOException {
// Create the Pipeline object with the options we defined above.
Pipeline p = Pipeline.create(
PipelineOptionsFactory.fromArgs(args).withValidation().create());
PTransform<PBegin, PCollection<KV<action_states_pkey, String>>> kafka =
KafkaIO.<action_states_pkey, String>read()
.withBootstrapServers("mybootstrapserver")
.withTopic("action_States")
.withKeyDeserializer(MyClassKafkaAvroDeserializer.class)
.withValueDeserializer(StringDeserializer.class)
.updateConsumerProperties(ImmutableMap.of("schema.registry.url", (Object)"schemaregistryurl"))
.withMaxNumRecords(5)
.withoutMetadata();
p.apply(kafka)
.apply(Keys.<action_states_pkey>create());
}
where MyClassKafkaAvroDeserializer is
public class MyClassKafkaAvroDeserializer extends
AbstractKafkaAvroDeserializer implements Deserializer<action_states_pkey> {
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
configure(new KafkaAvroDeserializerConfig(configs));
}
@Override
public action_states_pkey deserialize(String s, byte[] bytes) {
return (action_states_pkey) this.deserialize(bytes);
}
@Override
public void close() {} }
and the class action_states_pkey is code-generated with avro-tools using
java -jar pathtoavrotools/avro-tools-1.8.1.jar compile schema pathtoschema/action_states_pkey.avsc destination path
where the action_states_pkey.avsc is literally
{"type":"record","name":"action_states_pkey","namespace":"namespace","fields":[{"name":"ad_id","type":["null","int"]},{"name":"action_id","type":["null","int"]},{"name":"state_id","type":["null","int"]}]}
With this code I'm getting the error:
Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to my.mudah.beam.test.action_states_pkey
at my.mudah.beam.test.MyClassKafkaAvroDeserializer.deserialize(MyClassKafkaAvroDeserializer.java:20)
at my.mudah.beam.test.MyClassKafkaAvroDeserializer.deserialize(MyClassKafkaAvroDeserializer.java:1)
at org.apache.beam.sdk.io.kafka.KafkaUnboundedReader.advance(KafkaUnboundedReader.java:221)
at org.apache.beam.sdk.io.BoundedReadFromUnboundedSource$UnboundedToBoundedSourceAdapter$Reader.advanceWithBackoff(BoundedReadFromUnboundedSource.java:279)
at org.apache.beam.sdk.io.BoundedReadFromUnboundedSource$UnboundedToBoundedSourceAdapter$Reader.start(BoundedReadFromUnboundedSource.java:256)
at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.start(WorkerCustomSources.java:592)
... 14 more
It seems there's an error when trying to map the Avro data to my custom class?
Alternatively, I've tried the following code :
PTransform<PBegin, PCollection<KV<action_states_pkey, String>>> kafka =
KafkaIO.<action_states_pkey, String>read()
.withBootstrapServers("bootstrapserver")
.withTopic("action_states")
.withKeyDeserializerAndCoder((Class)KafkaAvroDeserializer.class, AvroCoder.of(action_states_pkey.class))
.withValueDeserializer(StringDeserializer.class)
.updateConsumerProperties(ImmutableMap.of("schema.registry.url", (Object)"schemaregistry"))
.withMaxNumRecords(5)
.withoutMetadata();
p.apply(kafka)
.apply(Keys.<action_states_pkey>create());
// .apply("ExtractWords", ParDo.of(new DoFn<action_states_pkey, String>() {
// @ProcessElement
// public void processElement(ProcessContext c) {
// action_states_pkey key = c.element();
// c.output(key.getAdId().toString());
// }
// }));
which does not give me any error until I try to print out the data. I have to verify that I'm successfully reading the data one way or another, so my intent here is to log the data to the console. If I uncomment the commented section, I get the same error once again:
SEVERE: 2019-09-13T07:53:56.168Z: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to my.mudah.beam.test.action_states_pkey
at my.mudah.beam.test.Readkafka$1.processElement(Readkafka.java:151)
Another thing to note is that specifying:
.updateConsumerProperties(ImmutableMap.of("specific.avro.reader", (Object)"true"))
always gives me an error of:
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 443
Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class NAMESPACE.action_states_pkey specified in writer's schema whilst finding reader's schema for a SpecificRecord.
Is there something wrong with my approach?
If anyone has any experience reading AVRO data from Kafka Streams using Apache Beam, please do help me out. I greatly appreciate it.
Here's a snapshot of my package with the schema and class in it as well:
[screenshot: package/working path details]
Thanks.
public class MyClassKafkaAvroDeserializer extends
AbstractKafkaAvroDeserializer
Your class extends AbstractKafkaAvroDeserializer, which returns a GenericRecord.
You need to convert the GenericRecord to your custom object.
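A minimal sketch of that conversion might look like the following modified version of your deserializer (assuming the avro-tools-generated action_states_pkey class has the usual setAdId/setActionId/setStateId setters; adjust the names to whatever was actually generated):
import java.util.Map;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Deserializer;
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
public class MyClassKafkaAvroDeserializer extends AbstractKafkaAvroDeserializer
        implements Deserializer<action_states_pkey> {
    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        configure(new KafkaAvroDeserializerConfig(configs));
    }
    @Override
    public action_states_pkey deserialize(String topic, byte[] bytes) {
        // The base class hands back a GenericRecord, so copy its fields into the
        // generated class instead of casting (field names taken from the .avsc above).
        GenericRecord record = (GenericRecord) this.deserialize(bytes);
        if (record == null) {
            return null;
        }
        action_states_pkey key = new action_states_pkey();
        key.setAdId((Integer) record.get("ad_id"));
        key.setActionId((Integer) record.get("action_id"));
        key.setStateId((Integer) record.get("state_id"));
        return key;
    }
    @Override
    public void close() {
    }
}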
Alternatively, use SpecificRecord for this, as described in one of the answers linked below:
/**
* Extends deserializer to support ReflectData.
*
* @param <V>
* value type
*/
public abstract class ReflectKafkaAvroDeserializer<V> extends KafkaAvroDeserializer {
private Schema readerSchema;
private DecoderFactory decoderFactory = DecoderFactory.get();
protected ReflectKafkaAvroDeserializer(Class<V> type) {
readerSchema = ReflectData.get().getSchema(type);
}
@Override
protected Object deserialize(
boolean includeSchemaAndVersion,
String topic,
Boolean isKey,
byte[] payload,
Schema readerSchemaIgnored) throws SerializationException {
if (payload == null) {
return null;
}
int schemaId = -1;
try {
ByteBuffer buffer = ByteBuffer.wrap(payload);
if (buffer.get() != MAGIC_BYTE) {
throw new SerializationException("Unknown magic byte!");
}
schemaId = buffer.getInt();
Schema writerSchema = schemaRegistry.getByID(schemaId);
int start = buffer.position() + buffer.arrayOffset();
int length = buffer.limit() - 1 - idSize;
DatumReader<Object> reader = new ReflectDatumReader(writerSchema, readerSchema);
BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
return reader.read(null, decoder);
} catch (IOException e) {
throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
} catch (RestClientException e) {
throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
}
}
}
The above is copied from https://stackoverflow.com/a/39617120/2534090
https://stackoverflow.com/a/42514352/2534090
I have some code that works properly on Spring Boot versions prior to 2, and I'm finding it hard to convert it to work with Spring Boot 2.
Can somebody assist?
public static MutablePropertySources buildPropertySources(String propertyFile, String profile)
{
try
{
Properties properties = new Properties();
YamlPropertySourceLoader loader = new YamlPropertySourceLoader();
// load common properties
PropertySource<?> applicationYamlPropertySource = loader.load("properties", new ClassPathResource(propertyFile), null);
Map<String, Object> source = ((MapPropertySource) applicationYamlPropertySource).getSource();
properties.putAll(source);
// load profile properties
if (null != profile)
{
applicationYamlPropertySource = loader.load("properties", new ClassPathResource(propertyFile), profile);
if (null != applicationYamlPropertySource)
{
source = ((MapPropertySource) applicationYamlPropertySource).getSource();
properties.putAll(source);
}
}
MutablePropertySources propertySources = new MutablePropertySources();
propertySources.addLast(new PropertiesPropertySource("apis", properties));
return propertySources;
}
catch (Exception e)
{
log.error("{} file cannot be found.", propertyFile);
return null;
}
}
public static <T> void handleConfigurationProperties(T bean, MutablePropertySources propertySources) throws BindException
{
ConfigurationProperties configurationProperties = bean.getClass().getAnnotation(ConfigurationProperties.class);
if (null != configurationProperties && null != propertySources)
{
String prefix = configurationProperties.prefix();
String value = configurationProperties.value();
if (null == value || value.isEmpty())
{
value = prefix;
}
PropertiesConfigurationFactory<?> configurationFactory = new PropertiesConfigurationFactory<>(bean);
configurationFactory.setPropertySources(propertySources);
configurationFactory.setTargetName(value);
configurationFactory.bindPropertiesToTarget();
}
}
PropertiesConfigurationFactory doesn't exist anymore, and the YamlPropertySourceLoader load method no longer accepts 3 parameters.
(The return value is not the same either; when I tried invoking the new method, the returned objects were wrapped instead of giving me the direct strings/integers, etc.)
PropertiesConfigurationFactory should be replaced with the Binder class.
Binder class
Sample code:
ConfigurationPropertySource source = new MapConfigurationPropertySource(
loadProperties(resource));
Binder binder = new Binder(source);
return binder.bind("initializr", InitializrProperties.class).get();
We were also using PropertiesConfigurationFactory to bind a POJO to a
prefix of the Environment. In 2.0, a brand new Binder API was
introduced that is more flexible and easier to use. Our binding that
took 10 lines of code could be reduced to 3 simple lines.
YamlPropertySourceLoader:
Yes, this class has been changed in version 2. It doesn't accept the third profile parameter anymore. The method signature has been changed to return List<PropertySource<?>> rather than PropertySource<?>. If you are expecting a single source, get the first element of the list.
Load the resource into one or more property sources. Implementations
may either return a list containing a single source, or in the case of
a multi-document format such as yaml a source for each document in the
resource.
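For example, a hedged sketch of how the loading part of buildPropertySources might look against the Spring Boot 2 API (the profile parameter is gone, so profile-specific YAML documents would have to be selected from the returned list yourself):
YamlPropertySourceLoader loader = new YamlPropertySourceLoader();
// Spring Boot 2: load(name, resource) returns a List<PropertySource<?>>,
// one source per document in the YAML file.
List<PropertySource<?>> yamlSources = loader.load("properties", new ClassPathResource(propertyFile));
Properties properties = new Properties();
for (PropertySource<?> source : yamlSources) {
    properties.putAll(((MapPropertySource) source).getSource());
}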
Since there is no accepted answer yet, I'll post my full solution, which builds upon the answer from @nationquest:
private ConfigClass loadConfiguration(String path){
MutablePropertySources sources = new MutablePropertySources();
Resource res = new FileSystemResource(path);
PropertiesFactoryBean propFactory = new PropertiesFactoryBean();
propFactory.setLocation(res);
propFactory.setSingleton(false);
// resolve potential references to local environment variables
Properties properties = null;
try {
properties = propFactory.getObject();
for(String p : properties.stringPropertyNames()){
properties.setProperty(p, env.resolvePlaceholders(properties.getProperty(p)));
}
} catch (IOException e) {
e.printStackTrace();
}
sources.addLast(new PropertiesPropertySource("prefix", properties));
ConfigurationPropertySource propertySource = new MapConfigurationPropertySource(properties);
return new Binder(propertySource).bind("prefix", ConfigClass.class).get();
}
The last three lines are the relevant part here.
Dirty code, but this is how I solved it:
https://github.com/mamaorha/easy-wire/tree/master/src/main/java/co/il/nmh/easy/wire/core/utils/properties
This is my first attempt at using a KTable. I have a Kafka stream that contains Avro-serialized objects of type A,B, and this works fine: I can write a Consumer that consumes just fine, or a simple KStream that simply counts records.
The B object has a field containing a country code. I'd like to supply that code to a KTable so it can count the number of records that contain a particular country code. To do so, I'm trying to convert the stream into a stream of X,Y (or really: country code, count). Eventually I look at the contents of the table and extract an array of KV pairs.
The code I have (included) always errors out with the following (see the line with 'Caused by'):
2018-07-26 13:42:48.688 [com.findology.tools.controller.TestEventGeneratorController-16d7cd06-4742-402e-a679-898b9ef78c41-StreamThread-1; AssignedStreamsTasks] ERROR -- stream-thread [com.findology.tools.controller.TestEventGeneratorController-16d7cd06-4742-402e-a679-898b9ef78c41-StreamThread-1] Failed to process stream task 0_0 due to the following error:
org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000000, topic=com.findology.model.traffic.CpaTrackingCallback, partition=0, offset=962649
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:240)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:94)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:411)
at org.apache.kafka.streams.processor.internals.StreamThread.processAndMaybeCommit(StreamThread.java:922)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:802)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:749)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:719)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.ByteArraySerializer / value: org.apache.kafka.common.serialization.ByteArraySerializer) is not compatible to the actual key or value type (key type: java.lang.Integer / value type: java.lang.Integer). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:92)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.forward(AbstractProcessorContext.java:174)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:43)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:46)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:211)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:124)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.forward(AbstractProcessorContext.java:174)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:46)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:211)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:124)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.forward(AbstractProcessorContext.java:174)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:80)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:224)
... 6 more
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to [B
at org.apache.kafka.common.serialization.ByteArraySerializer.serialize(ByteArraySerializer.java:21)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:146)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:94)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:87)
... 19 more
And here is the code I'm using. I've omitted certain classes for brevity. Note that I'm not using the Confluent KafkaAvro classes.
private synchronized void createStreamProcessor2() {
if (streams == null) {
try {
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, getClass().getName());
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
StreamsConfig config = new StreamsConfig(props);
StreamsBuilder builder = new StreamsBuilder();
Map<String, Object> serdeProps = new HashMap<>();
serdeProps.put("schema.registry.url", schemaRegistryURL);
AvroSerde<CpaTrackingCallback> cpaTrackingCallbackAvroSerde = new AvroSerde<>(schemaRegistryURL);
cpaTrackingCallbackAvroSerde.configure(serdeProps, false);
// This is the key to telling kafka the specific Serde instance to use
// to deserialize the Avro encoded value
KStream<Long, CpaTrackingCallback> stream = builder.stream(CpaTrackingCallback.class.getName(),
Consumed.with(Serdes.Long(), cpaTrackingCallbackAvroSerde));
// provide a way to convert CpsTrackicking... info into just country codes
// (Long, CpaTrackingCallback) -> (countryCode:Integer, placeHolder:Long)
TransformerSupplier<Long, CpaTrackingCallback, KeyValue<Integer, Long>> transformer = new TransformerSupplier<Long, CpaTrackingCallback, KeyValue<Integer, Long>>() {
@Override
public Transformer<Long, CpaTrackingCallback, KeyValue<Integer, Long>> get() {
return new Transformer<Long, CpaTrackingCallback, KeyValue<Integer, Long>>() {
@Override
public void init(ProcessorContext context) {
// Not doing Punctuate so no need to store context
}
@Override
public KeyValue<Integer, Long> transform(Long key, CpaTrackingCallback value) {
return new KeyValue(value.getCountryCode(), 1);
}
@Override
public KeyValue<Integer, Long> punctuate(long timestamp) {
return null;
}
@Override
public void close() {
}
};
}
};
KTable<Integer, Long> countryCounts = stream.transform(transformer).groupByKey() //
.count(Materialized.as("country-counts"));
streams = new KafkaStreams(builder.build(), config);
Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
streams.cleanUp();
streams.start();
try {
countryCountsView = waitUntilStoreIsQueryable("country-counts", QueryableStoreTypes.keyValueStore(),
streams);
}
catch (InterruptedException e) {
log.warn("Interrupted while waiting for query store to become available", e);
}
}
catch (Exception e) {
log.error(e);
}
}
}
The bare groupByKey() method on KStream uses the default serializer/deserializer (which you haven't set). Use the method groupByKey(Serialized<K,V> serialized), as in:
.groupByKey(Serialized.with(Serdes.Integer(), Serdes.Long()))
Also note that what you do in your custom TransformerSupplier can be done more simply with a KStream.map call.
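For example, a hedged sketch of the map-based variant (assuming getCountryCode() returns an Integer, and using the Serialized API that matches the Kafka Streams version in your code):
KTable<Integer, Long> countryCounts = stream
        // (Long, CpaTrackingCallback) -> (countryCode, 1L), replacing the TransformerSupplier
        .map((key, value) -> KeyValue.pair(value.getCountryCode(), 1L))
        .groupByKey(Serialized.with(Serdes.Integer(), Serdes.Long()))
        .count(Materialized.as("country-counts"));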
I have implemented the following class, with a search method that gets a list of objects of the given type T using a REST call.
public class AmapiService<T extends Resource> implements RestService<T> {
@Override
public List<T> search( MultivaluedMap<String, String> aQueryParams ) {
MultivaluedMap<String, String> lQueryParams = aQueryParams;
Client lClient = buildClient();
WebTarget ltarget = lClient.target( iRestDriver.target( aApiUrl ).getUri() );
for ( String lKey : lQueryParams.keySet() ) {
ltarget = ltarget.queryParam( lKey, lQueryParams.getFirst( lKey ) );
}
List<T> lObjects = null;
lObjects = ltarget.request( MediaType.APPLICATION_JSON ).get( new GenericType<List<T>>() {
} );
return lObjects;
}
}
This is how the instance of the class is created and the search method is called:
AmapiService<RFQDefinition> RFQService =
new AmapiService<RFQDefinition>();
List<RFQDefinition> qfq = RFQService.search( lQueryParam );
When I run this I'm getting the following error:
May 07, 2018 1:48:14 PM org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$TerminalReaderInterceptor aroundReadFrom
SEVERE: MessageBodyReader not found for media type=application/json, type=interface java.util.List, genericType=java.util.List<T>.
That means RFQDefinition is not being substituted for T in new GenericType<List<T>>().
How can I pass the type T from AmapiService<T extends Resource> through to new GenericType<List<T>>()?
If I write new GenericType<List<RFQDefinition>>() instead of using T, it works.
Your use of generics does not actually serve a purpose here.
There's no true generic implementation, as your response is based on the parameter you give to the service you consume (.get( new GenericType<List<T>>(){})). Because of type erasure, that anonymous GenericType captures the type variable T itself rather than the concrete class bound at the call site, and the code is invoked by the JAX-RS implementation, which does not go through static linking.
In other words, no code is going to compile and run against your class/method using the parameter:
//this is not a valid use-case:
List<ActualType> res = search(aQueryParams);
Put simply: Java generics don't make it into the REST API. You have to statically use the Java type for the model your API is supposed to return (just as you do with new GenericType<List<RFQDefinition>>()).
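One hedged workaround that keeps a single generic service class is to let the caller supply the GenericType, so the concrete model type is fixed at the call site; the extra parameter and the buildTarget helper below are illustrative, not part of your existing API:
// Inside AmapiService<T extends Resource>: the caller passes the response type explicitly.
public List<T> search( MultivaluedMap<String, String> aQueryParams, GenericType<List<T>> aResponseType ) {
    // buildTarget stands in for the client/target/queryParam construction from the question
    WebTarget ltarget = buildTarget( aQueryParams );
    return ltarget.request( MediaType.APPLICATION_JSON ).get( aResponseType );
}

// Call site:
List<RFQDefinition> rfq = RFQService.search( lQueryParam, new GenericType<List<RFQDefinition>>() {} );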
It's not related to generics.
Jersey complains that it does not have any MessageBodyReader to read the application/json body.
Normally, you can add Jackson as the JSON encoder/decoder for your application.
Read this:
https://jersey.github.io/documentation/latest/media.html#json.jackson
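If you go this route with the Jersey client, registering the Jackson feature when building the client is typically all that's needed; a minimal sketch, assuming the jersey-media-json-jackson artifact is on the classpath:
// Register Jackson so the client has a MessageBodyReader/Writer for application/json.
Client lClient = ClientBuilder.newBuilder()
        .register(JacksonFeature.class)
        .build();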
I have changed the search method implementation like this, and now it is working.
iClass can be passed in as a parameter, e.g. RFQDefinition.class:
ObjectMapper mapper = new ObjectMapper();
JavaType type = mapper.getTypeFactory().constructCollectionType( List.class, iClass );
List<T> lObjects = null;
JsonArray lResponse = ltarget.request( MediaType.APPLICATION_JSON ).get( JsonArray.class );
try {
lObjects = mapper.readValue( lResponse.toString(), type );
} catch ( IOException e ) {
// JsonParseException and JsonMappingException are both subclasses of IOException
e.printStackTrace();
}
I get the following problem when trying to display a list of items. For each item, I have to display an image which is dynamically loaded via a Wicket WebResource. The items are loaded step by step — 50 at a time — upon user scrolling, using an Ajax scroll.
[ERROR] 2011-04-19 09:58:18,000 btpool0-1 org.apache.wicket.RequestCycle.logRuntimeException (host=, request=, site=):
org.apache.wicket.WicketRuntimeException: component documentList:scroller:batchElem:666:content:item:3:batchItemContent:linkToPreview:imageThumbnail not found on page com.webapp.document.pages.DocumentListPage[id = 1]
listener interface = [RequestListenerInterface name=IResourceListener, method=public abstract void org.apache.wicket.IResourceListener.onResourceRequested()]
org.apache.wicket.protocol.http.request.InvalidUrlException: org.apache.wicket.WicketRuntimeException: component documentList:scroller:batchElem:666:content:item:3:batchItemContent:linkToPreview:imageThumbnail
not found on page com.webapp.document.pages.DocumentListPage[id = 1] listener interface = [RequestListenerInterface name=IResourceListener, method=public abstract void org.apache.wicket.IResourceListener.onResourceRequested()]
at org.apache.wicket.protocol.http.WebRequestCycleProcessor.resolve(WebRequestCycleProcessor.java:262)
at org.apache.wicket.RequestCycle.step(RequestCycle.java:1310)
at org.apache.wicket.RequestCycle.steps(RequestCycle.java:1428)
at org.apache.wicket.RequestCycle.request(RequestCycle.java:545)
at org.apache.wicket.protocol.http.WicketFilter.doGet(WicketFilter.java:479)
at org.apache.wicket.protocol.http.WicketFilter$$EnhancerByGuice$$51619816.CGLIB$doGet$6()
at org.apache.wicket.protocol.http.WicketFilter$$EnhancerByGuice$$51619816$$FastClassByGuice$$6d42bf5d.invoke()
at com.google.inject.internal.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:64)
at com.freiheit.monitoring.PerformanceMonitoringMethodInterceptor.invoke(PerformanceMonitoringMethodInterceptor.java:115)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:64)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:44)
at org.apache.wicket.protocol.http.WicketFilter$$EnhancerByGuice$$51619816.doGet()
at org.apache.wicket.protocol.http.WicketFilter.doFilter(WicketFilter.java:312)
at org.apache.wicket.protocol.http.WicketFilter$$EnhancerByGuice$$51619816.CGLIB$doFilter$4()
How can this problem be solved?
Here is the part of the code responsible for adding the image:
previewLink.add(createThumbnailSmall("imageThumbnail", documentModel));
in
Component createThumbnailSmall(final String id, final IModel<BaseDocument> documentModel) {
// thumbnailResource is an object that contains the path of the image
if (thumbnailResource != null) {
final WebResource resource = getWebResource(thumbnailResource);
final Image image = new Image(id, resource);
return image;
}
return new InvisibleContainer(id);
}
WebResource getWebResource(final DocumentResource documentResource) {
return new WebResource() {
private static final long serialVersionUID = 1L;
@Override
public IResourceStream getResourceStream() {
return new BaseStreamResource(documentResource);
}
};
}
where BaseStreamResource is the following:
public class BaseStreamResource extends AbstractResourceStream {
private InputStream _fileInputStream = null;
private DocumentResource _resource = null;
public BaseStreamResource(final DocumentResource documentResource) {
_resource = documentResource;
}
@Override
public InputStream getInputStream() throws ResourceStreamNotFoundException {
if (_fileInputStream == null) {
try {
if (_resource == null) {
throw new ResourceStreamNotFoundException("Resource was null");
}
_fileInputStream = _resource.getFileInputStream();
} catch (final ResourceNotAvailableException ex) {
throw new ResourceStreamNotFoundException(ex);
}
}
return _fileInputStream;
}
In HTML:
<a wicket:id="linkToPreview" href="#">
<img wicket:id="imageThumbnail" alt="Attachment"></img></a>
The code added hasn't really added any clues for me, but maybe I can help narrow it down a bit anyway.
The stacktrace includes a reference to com.webapp.document.pages.DocumentListPage, which is likely calling some of the code you've posted. The error indicates a bad url, so debugging into that class, adding debug prints, and looking at the values of any field containing a url might be worthwhile.
It might even help to modify the code in DocumentListPage (maybe temporarily for debugging) to catch org.apache.wicket.protocol.http.request.InvalidUrlException and adding debugging prints specifically when the exception is caught.
This isn't really an answer, but it's too big for a comment, and maybe it'll help you get closer to an answer.
The following solution solved the problem:
- extend the WebResource class
- add the extended class as a resource to the application's shared resources
Example:
public class MyWebResource extends WebResource {
final ValueMap map = new ValueMap();
@Override
public IResourceStream getResourceStream() {
String fileName = getFileName();
File file = new File(basePath, fileName);
if (!file.exists()) {
LOG.error("File does not exist: " + file);
throw new IllegalStateException("File does not exist: " + file);
}
return new FileResourceStream(file);
}
public final void addResource() {
Application.get().getSharedResources().add(getClass().getName(), this);
}
protected String getFileName() {
return getParameters().getString("id");
}
public final String urlFor(final String fileName) {
final ResourceReference resourceReference = new ResourceReference(getClass().getName());
final String encodedValue = WicketURLEncoder.QUERY_INSTANCE.encode(fileName);
map.add("id", encodedValue);
final CharSequence result = RequestCycle.get().urlFor(resourceReference, map);
if (result == null) {
throw new IllegalStateException("The resource was not added! "
+ "In your Application class add the following line:"
+ "MyConcreteResource.INSTANCE.addResource()");
}
String absoluteUrl = RequestUtils.toAbsolutePath(result.toString());
return absoluteUrl;
}
}
In the Application class, in init(), I added MyWebResource to the shared resources:
public void init() {
...
new MyWebResource().addResource();
...
}
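For completeness, a hedged sketch of how the shared resource might then be wired to an image tag (the documentId value, the component wiring and the Wicket 1.4-era SimpleAttributeModifier usage are illustrative):
MyWebResource thumbnails = new MyWebResource();
// documentId is an illustrative value understood by getFileName()
String url = thumbnails.urlFor(documentId);
WebMarkupContainer image = new WebMarkupContainer("imageThumbnail");
image.add(new SimpleAttributeModifier("src", url));
previewLink.add(image);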