Found interface org.apache.hadoop.mapreduce.TaskAttemptContext - java

I haven't seen a solution to my particular problem so far; at least, nothing I've tried works, and it's driving me pretty crazy. This particular combination doesn't turn up much on Google. From what I can tell, the error occurs as the job goes into the mapper. The input to this job is Avro-schema'd output compressed with deflate, though I tried uncompressed as well.
Avro: 1.7.7
Hadoop: 2.4.1
I am getting this error and I'm not sure why. Here are my job, mapper, and reducer. The error happens when the mapper starts.
Sample uncompressed Avro input file (StockReport.SCHEMA$ is defined this way):
{"day": 3, "month": 2, "year": 1986, "stocks": [{"symbol": "AAME", "timestamp": 507833213000, "dividend": 10.59}]}
Job:
@Override
public int run(String[] strings) throws Exception {
    Job job = Job.getInstance();
    job.setJobName("GenerateGraphsJob");
    job.setJarByClass(GenerateGraphsJob.class);
    configureJob(job);
    int resultCode = job.waitForCompletion(true) ? 0 : 1;
    return resultCode;
}
private void configureJob(Job job) throws IOException {
    try {
        Configuration config = getConf();
        Path inputPath = ConfigHelper.getChartInputPath(config);
        Path outputPath = ConfigHelper.getChartOutputPath(config);

        job.setInputFormatClass(AvroKeyInputFormat.class);
        AvroKeyInputFormat.addInputPath(job, inputPath);
        AvroJob.setInputKeySchema(job, StockReport.SCHEMA$);

        job.setMapperClass(StockAverageMapper.class);
        job.setCombinerClass(StockAverageCombiner.class);
        job.setReducerClass(StockAverageReducer.class);

        FileOutputFormat.setOutputPath(job, outputPath);
    } catch (IOException | ClassCastException e) {
        LOG.error("A job error has occurred.", e);
    }
}
Mapper:
public class StockAverageMapper extends
        Mapper<AvroKey<StockReport>, NullWritable, StockYearSymbolKey, StockReport> {

    private static Logger LOG = LoggerFactory.getLogger(StockAverageMapper.class);

    private final StockReport stockReport = new StockReport();
    private final StockYearSymbolKey stockKey = new StockYearSymbolKey();

    @Override
    protected void map(AvroKey<StockReport> inKey, NullWritable ignore, Context context)
            throws IOException, InterruptedException {
        try {
            StockReport inKeyDatum = inKey.datum();
            for (Stock stock : inKeyDatum.getStocks()) {
                updateKey(inKeyDatum, stock);
                updateValue(inKeyDatum, stock);
                context.write(stockKey, stockReport);
            }
        } catch (Exception ex) {
            LOG.debug(ex.toString());
        }
    }
Schema for map output key:
{
    "namespace": "avro.model",
    "type": "record",
    "name": "StockYearSymbolKey",
    "fields": [
        {
            "name": "year",
            "type": "int"
        },
        {
            "name": "symbol",
            "type": "string"
        }
    ]
}
Stack trace:
java.lang.Exception: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
at org.apache.avro.mapreduce.AvroKeyInputFormat.createRecordReader(AvroKeyInputFormat.java:47)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:492)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:735)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Edit: Not that it matters, but I'm working to reduce this to data I can create JFreeChart outputs from. Since the job never gets through the mapper, that shouldn't be related.

The problem is that org.apache.hadoop.mapreduce.TaskAttemptContext was a class in Hadoop 1 but became an interface in Hadoop 2.
This is one of the reasons why libraries that depend on the Hadoop libs need separately compiled jarfiles for Hadoop 1 and Hadoop 2. Based on your stack trace, it appears that you somehow got a Hadoop 1-compiled Avro jarfile, despite running with Hadoop 2.4.1.
The download mirrors for Avro provide separate downloadables for exactly this reason: avro-mapred-1.7.7-hadoop1.jar vs. avro-mapred-1.7.7-hadoop2.jar.
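If it isn't obvious which jar ended up on the classpath, one quick check (my own sketch, not part of the original answer) is to ask the JVM where it loaded the class from the stack trace:

// Prints the jar that actually provided AvroKeyInputFormat at runtime;
// a ...-hadoop1 path here would confirm the mismatch.
System.out.println(org.apache.avro.mapreduce.AvroKeyInputFormat.class
        .getProtectionDomain().getCodeSource().getLocation());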

The problem is that Avro 1.7.7 supports two versions of Hadoop and hence depends on both. By default, the Avro 1.7.7 jars depend on the old Hadoop version. To build Avro 1.7.7 against Hadoop 2, just add an extra classifier line to the Maven dependency:
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-mapred</artifactId>
    <version>1.7.7</version>
    <classifier>hadoop2</classifier>
</dependency>
This tells Maven to look for avro-mapred-1.7.7-hadoop2.jar instead of avro-mapred-1.7.7.jar. The same applies to Avro 1.7.4 and above.
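To confirm the classifier took effect, check the resolved dependency tree with mvn dependency:tree; the avro-mapred entry should include the classifier, looking something like:

org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:compile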

Related

Java - Using Jackson's JsonNode to parse nested JSON values results in NullPointerException

I've been trying for a while to get this parser to work correctly, but I keep hitting the same issue every time: the code is very insistent that the value is null.
The issue is this: I have a call to the Google Books API which returns a fairly large JSON response. The values I need from this are nested and are proving hard to get at.
This is an example of the JSON in question. What I want to get at is some of the books' identifying information, which can be found under volumeInfo.
{
    "kind": "books#volumes",
    "totalItems": 1,
    "items": [
        {
            "kind": "books#volume",
            "id": "zkqMEAAAQBAJ",
            "etag": "4V9ue/R0aw4",
            "selfLink": "https://www.googleapis.com/books/v1/volumes/zkqMEAAAQBAJ",
            "volumeInfo": {
                "title": "The Song of the Cell",
                "subtitle": "An Exploration of Medicine and the New Human",
                "authors": [
                    "Siddhartha Mukherjee"
                ],
                "publisher": "Simon and Schuster",
                "publishedDate": "2022-10-25",
                "description": "Presenting revelatory and exhilarating stories of scientists, doctors, and the patients whose lives may be saved by their work, the author draws on his own experience as a researcher, doctor, and prolific reader to explore how the discovery of cells created a new kind of medicine based on the therapeutic manipulation of cells.",
                "industryIdentifiers": [
                    {
                        "type": "ISBN_13",
                        "identifier": "9781982117351"
                    },
                    {
                        "type": "ISBN_10",
                        "identifier": "1982117354"
                    }
                ],
                "readingModes": {
                    "text": false,
                    "image": false
                },
                "pageCount": 496,
                "printType": "BOOK",
                "categories": [
                    "History"
                ],
                "averageRating": 5,
                "ratingsCount": 2,
                "maturityRating": "NOT_MATURE",
                "allowAnonLogging": false,
                "contentVersion": "0.6.2.0.preview.0",
                "panelizationSummary": {
                    "containsEpubBubbles": false,
                    "containsImageBubbles": false
                },
                "imageLinks": {
                    "smallThumbnail": "http://books.google.com/books/content?id=zkqMEAAAQBAJ&printsec=frontcover&img=1&zoom=5&edge=curl&source=gbs_api",
                    "thumbnail": "http://books.google.com/books/content?id=zkqMEAAAQBAJ&printsec=frontcover&img=1&zoom=1&edge=curl&source=gbs_api"
                },
                "language": "en",
                "previewLink": "http://books.google.com/books?id=zkqMEAAAQBAJ&printsec=frontcover&dq=isbn:9781982117351&hl=&cd=1&source=gbs_api",
                "infoLink": "http://books.google.com/books?id=zkqMEAAAQBAJ&dq=isbn:9781982117351&hl=&source=gbs_api",
                "canonicalVolumeLink": "https://books.google.com/books/about/The_Song_of_the_Cell.html?hl=&id=zkqMEAAAQBAJ"
            },
            "saleInfo": {
                "country": "US",
                "saleability": "NOT_FOR_SALE",
                "isEbook": false
            },
            "accessInfo": {
                "country": "US",
                "viewability": "PARTIAL",
                "embeddable": true,
                "publicDomain": false,
                "textToSpeechPermission": "ALLOWED_FOR_ACCESSIBILITY",
                "epub": {
                    "isAvailable": false
                },
                "pdf": {
                    "isAvailable": false
                },
                "webReaderLink": "http://play.google.com/books/reader?id=zkqMEAAAQBAJ&hl=&source=gbs_api",
                "accessViewStatus": "SAMPLE",
                "quoteSharingAllowed": false
            },
            "searchInfo": {
                "textSnippet": "Presenting revelatory and exhilarating stories of scientists, doctors, and the patients whose lives may be saved by their work, the author draws on his own experience as a researcher, doctor, and prolific reader to explore how the discovery ..."
            }
        }
    ]
}
My code is fairly simple. I've tried a few different combinations of JsonNode, JsonFactory and JsonParser but nothing has seemed to work like it should. Some of the resources I looked at while trying to fix this were this article on Baeldung, this tutorial and more Stack Overflow posts than I can list. The API call itself is not the problem. It works just fine.
This is standard Java. No Spring or other frameworks.
ObjectMapper mapper = new ObjectMapper();

URL url = new URL(setURL(bookInfo, infoType));
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
InputStream responseStream = connection.getInputStream();

JsonNode volumeNode = mapper.readTree(responseStream);

GoogleVolume testVolume = new GoogleVolume();
testVolume.setTitle(volumeNode.get("title").textValue());
Here is the stack trace:
Exception in thread "JavaFX Application Thread" java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at javafx.fxml@19/javafx.fxml.FXMLLoader$MethodHandler.invoke(FXMLLoader.java:1857)
at javafx.fxml@19/javafx.fxml.FXMLLoader$ControllerMethodEventHandler.handle(FXMLLoader.java:1724)
at javafx.base@19/com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:86)
at javafx.base@19/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:234)
at javafx.base@19/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at javafx.base@19/com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at javafx.base@19/com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:49)
at javafx.base@19/javafx.event.Event.fireEvent(Event.java:198)
at javafx.graphics@19/javafx.scene.Node.fireEvent(Node.java:8923)
at javafx.controls@19/javafx.scene.control.Button.fire(Button.java:203)
at javafx.controls@19/com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:207)
at javafx.controls@19/com.sun.javafx.scene.control.inputmap.InputMap.handle(InputMap.java:274)
at javafx.base@19/com.sun.javafx.event.CompositeEventHandler$NormalEventHandlerRecord.handleBubblingEvent(CompositeEventHandler.java:247)
at javafx.base@19/com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:80)
at javafx.base@19/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:234)
at javafx.base@19/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at javafx.base@19/com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base@19/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base@19/com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at javafx.base@19/com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:54)
at javafx.base@19/javafx.event.Event.fireEvent(Event.java:198)
at javafx.graphics@19/javafx.scene.Scene$MouseHandler.process(Scene.java:3894)
at javafx.graphics@19/javafx.scene.Scene.processMouseEvent(Scene.java:1887)
at javafx.graphics@19/javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2620)
at javafx.graphics@19/com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:411)
at javafx.graphics@19/com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:301)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at javafx.graphics@19/com.sun.javafx.tk.quantum.GlassViewEventHandler.lambda$handleMouseEvent$2(GlassViewEventHandler.java:450)
at javafx.graphics@19/com.sun.javafx.tk.quantum.QuantumToolkit.runWithoutRenderLock(QuantumToolkit.java:424)
at javafx.graphics@19/com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:449)
at javafx.graphics@19/com.sun.glass.ui.View.handleMouseEvent(View.java:551)
at javafx.graphics@19/com.sun.glass.ui.View.notifyMouse(View.java:937)
at javafx.graphics@19/com.sun.glass.ui.gtk.GtkApplication._runLoop(Native Method)
at javafx.graphics@19/com.sun.glass.ui.gtk.GtkApplication.lambda$runLoop$11(GtkApplication.java:316)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at com.sun.javafx.reflect.Trampoline.invoke(MethodUtil.java:77)
at jdk.internal.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at javafx.base@19/com.sun.javafx.reflect.MethodUtil.invoke(MethodUtil.java:275)
at javafx.fxml@19/com.sun.javafx.fxml.MethodHelper.invoke(MethodHelper.java:84)
at javafx.fxml@19/javafx.fxml.FXMLLoader$MethodHandler.invoke(FXMLLoader.java:1854)
... 46 more
Caused by: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.JsonNode.textValue()" because the return value of "com.fasterxml.jackson.databind.JsonNode.get(String)" is null
at com.libary@0.0.1-SNAPSHOT/com.library.models.GoogleBooksService.getBookData(GoogleBooksService.java:49)
at com.libary@0.0.1-SNAPSHOT/com.library.controllers.LoginController.attemptLogin(LoginController.java:33)
I apologize if the details are a bit light, but if any information is needed I'll be happy to provide it. Any input on why this is happening would be greatly appreciated!
According to the JSON structure, the items array must be fetched first, and since it is a list, the first item must be accessed by index:
String title = volumeNode.get("items").get(0).get("volumeInfo").get("title").textValue();
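As a small, hedged variant of the same line (not in the original answer), Jackson's path() can be used instead of get(); path() returns a MissingNode rather than null for absent fields, so a wrong path yields a null String from textValue() instead of a NullPointerException partway through the chain:

String title = volumeNode.path("items").path(0).path("volumeInfo").path("title").textValue();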
Well, "title" is not a property at the top level of your JSON, so when Jackson searches for such a property it finds nothing and returns null. Also, it seems to me that your code may be a bit too complex. You didn't provide your GoogleVolume class; however, its structure should mirror the structure of your JSON. I would suggest you use the ObjectMapper method
public <T> T readValue(String content, Class<T> valueType)
        throws JsonProcessingException, JsonMappingException
where the second parameter would be your GoogleVolume.class. Or you can use Map.class to parse your JSON string into an instance of Map<String, Object>. Also, if I may suggest simplifying your code even further: I wrote my own open-source library called MgntUtils that provides (among other utilities) a very simple HTTP client and a JsonUtils class for simple cases of serializing/parsing JSON. Your code, using my library, might look something like this:
HttpClient httpClient = new HttpClient();
String jsonResponse = httpClient.sendHttpRequest(urlStr, HttpClient.HttpMethod.GET);
GoogleVolume volume = JsonUtils.readObjectFromJsonString(jsonResponse, GoogleVolume.class);
For simplicity I omitted exception handling. Here is the Javadoc for the MgntUtils library. The library is available as a Maven artifact on Maven Central and on GitHub (including Javadoc and source code).
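For reference, a minimal sketch of the plain-Jackson readValue route suggested above, reusing responseStream from the question and using Map.class so it runs without the (unshown) GoogleVolume class:

ObjectMapper mapper = new ObjectMapper();
// Parse the whole response into generic maps and lists instead of a POJO.
Map<String, Object> root = mapper.readValue(responseStream, Map.class);
// "items" is a JSON array, so it deserializes to a List of Maps.
List<Map<String, Object>> items = (List<Map<String, Object>>) root.get("items");
Map<String, Object> volumeInfo = (Map<String, Object>) items.get(0).get("volumeInfo");
String title = (String) volumeInfo.get("title");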

Play Framework unable to read environment variable in Cloud Foundry

I am trying to read the connection URL environment variable of the PostgreSQL service inside my application.conf as follows:
db.default.driver="org.postgresql.Driver"
db.default.url=${?cloud.services.postgresql.connection.url}
My VCAP_SERVICES is as follows:
{
    "postgresql": [
        {
            "binding_name": null,
            "credentials": {
                "dbname": "sample-db",
                "end_points": [
                    {
                        "host": "x.x.x.x",
                        "network_id": "SF",
                        "port": "44980"
                    }
                ],
                "hostname": "x.x.x.x",
                "password": "sample-password",
                "port": "44980",
                "ports": {
                    "5432/tcp": "44980"
                },
                "uri": "postgres://sample-user:sample-password@x.x.x.x:44980/sample-db",
                "username": "sample-user"
            },
            "instance_name": "postgresql",
            "label": "postgresql",
            "name": "postgresql",
            "plan": "v9.6-dev",
            "provider": null,
            "syslog_drain_url": null,
            "tags": [
                "postgresql",
                "relational"
            ],
            "volume_mounts": []
        }
    ]
}
I am following this article.
However, the database won't configure, and the root cause was Configuration error[jdbcUrl is required with driverClassName.]. The full exception dump is below.
play.api.Configuration$$anon$1: Configuration error[Cannot initialize to database [default]]
at play.api.Configuration$.configError(Configuration.scala:155)
at play.api.Configuration.reportError(Configuration.scala:394)
at play.api.db.DefaultDBApi.$anonfun$initialize$1(DefaultDBApi.scala:76)
at scala.collection.immutable.List.foreach(List.scala:333)
at play.api.db.DefaultDBApi.initialize(DefaultDBApi.scala:68)
at play.api.db.DBApiProvider.get$lzycompute(DBModule.scala:92)
at play.api.db.DBApiProvider.get(DBModule.scala:77)
at play.api.db.DBApiProvider.get(DBModule.scala:59)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:85)
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:77)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:59)
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:61)
at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:52)
at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:147)
at com.google.inject.internal.MembersInjectorImpl.injectAndNotify(MembersInjectorImpl.java:101)
at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:71)
at com.google.inject.internal.InjectorImpl.injectMembers(InjectorImpl.java:1055)
at com.google.inject.util.Providers$GuicifiedProviderWithDependencies.initialize(Providers.java:154)
at com.google.inject.util.Providers$GuicifiedProviderWithDependencies$$FastClassByGuice$$2a7177aa.invoke(<generated>)
at com.google.inject.internal.SingleMethodInjector$1.invoke(SingleMethodInjector.java:51)
at com.google.inject.internal.SingleMethodInjector.inject(SingleMethodInjector.java:85)
at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:147)
at com.google.inject.internal.MembersInjectorImpl.injectAndNotify(MembersInjectorImpl.java:101)
at com.google.inject.internal.Initializer$InjectableReference.get(Initializer.java:245)
at com.google.inject.internal.Initializer.injectAll(Initializer.java:140)
at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:178)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:111)
at com.google.inject.Guice.createInjector(Guice.java:87)
at com.google.inject.Guice.createInjector(Guice.java:78)
at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala:200)
at play.inject.guice.GuiceBuilder.injector(GuiceBuilder.java:211)
at play.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.java:121)
at play.inject.guice.GuiceApplicationLoader.load(GuiceApplicationLoader.java:32)
at play.api.ApplicationLoader$JavaApplicationLoaderAdapter$1.load(ApplicationLoader.scala:181)
at play.core.server.DevServerStart$$anon$1.$anonfun$reload$3(DevServerStart.scala:190)
at play.utils.Threads$.withContextClassLoader(Threads.scala:22)
at play.core.server.DevServerStart$$anon$1.reload(DevServerStart.scala:182)
at play.core.server.DevServerStart$$anon$1.get(DevServerStart.scala:142)
at play.core.server.AkkaHttpServer.handleRequest(AkkaHttpServer.scala:301)
at play.core.server.AkkaHttpServer.$anonfun$createServerBinding$1(AkkaHttpServer.scala:191)
at akka.stream.impl.fusing.MapAsync$$anon$30.onPush(Ops.scala:1285)
at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:541)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:423)
at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:625)
at akka.stream.impl.fusing.GraphInterpreterShell$AsyncInput.execute(ActorGraphInterpreter.scala:502)
at akka.stream.impl.fusing.GraphInterpreterShell.processEvent(ActorGraphInterpreter.scala:600)
at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:769)
at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:784)
at akka.actor.Actor.aroundReceive(Actor.scala:535)
at akka.actor.Actor.aroundReceive$(Actor.scala:533)
at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:691)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:575)
at akka.actor.ActorCell.invoke(ActorCell.scala:545)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: play.api.Configuration$$anon$1: Configuration error[jdbcUrl is required with driverClassName.]
at play.api.Configuration$.configError(Configuration.scala:155)
at play.api.Configuration.reportError(Configuration.scala:394)
at play.api.db.HikariCPConnectionPool.create(HikariCPModule.scala:70)
at play.api.db.PooledDatabase.createDataSource(Databases.scala:249)
at play.api.db.DefaultDatabase.dataSource$lzycompute(Databases.scala:141)
at play.api.db.DefaultDatabase.dataSource(Databases.scala:139)
at play.api.db.DefaultDBApi.$anonfun$initialize$1(DefaultDBApi.scala:72)
... 57 common frames omitted
Caused by: java.lang.IllegalArgumentException: jdbcUrl is required with driverClassName.
at com.zaxxer.hikari.HikariConfig.validate(HikariConfig.java:1000)
at play.api.db.HikariCPConfig.toHikariConfig(HikariCPModule.scala:140)
at play.api.db.HikariCPConnectionPool.$anonfun$create$1(HikariCPModule.scala:57)
at scala.util.Try$.apply(Try.scala:210)
at play.api.db.HikariCPConnectionPool.create(HikariCPModule.scala:54)
... 61 common frames omitted
Using Play Framework 2.8.
It looks like Cloud Foundry support for Play Framework has been broken since Play 2.5: Cloud Foundry's Java Buildpack removed support for it, which in turn invalidates the corresponding Play Framework documentation referring to configuration keys prefixed by ${?cloud.services....}.
I ended up writing code to parse VCAP_SERVICES myself and insert it in the application loader, following the documentation on creating your own application loader:
package com.example.config;

import java.io.IOException;
import java.text.MessageFormat;
import java.util.HashMap;
import java.util.logging.Level;
import java.util.logging.Logger;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

import play.ApplicationLoader;
import play.inject.guice.GuiceApplicationBuilder;
import play.inject.guice.GuiceApplicationLoader;

public class MyApplicationLoader extends GuiceApplicationLoader {

    private static final Logger LOGGER = Logger.getLogger(MyApplicationLoader.class.getCanonicalName());

    @Override
    public GuiceApplicationBuilder builder(ApplicationLoader.Context context) {
        // https://www.programcreek.com/scala/play.api.Configuration
        // https://www.playframework.com/documentation/2.8.x/JavaDependencyInjection#Advanced:-Extending-the-GuiceApplicationLoader
        Config cloudConfig = parseCloudFoundryEnvironmentConfig(context);
        return initialBuilder
                .in(context.environment())
                .loadConfig(cloudConfig.withFallback(context.initialConfig()))
                .overrides(overrides(context));
    }

    static Config parseCloudFoundryEnvironmentConfig(ApplicationLoader.Context context) {
        final ObjectMapper objectMapper = new ObjectMapper();
        final HashMap<String, Object> configOutput = new HashMap<>();
        final String vcap_services_str = System.getenv("VCAP_SERVICES");
        if (vcap_services_str != null) {
            try {
                JsonNode rootNode = objectMapper.readTree(vcap_services_str);
                // ... parse VCAP_SERVICES and initialize the "db...." Play configuration into `configOutput`
            } catch (IOException ex) {
                LOGGER.log(Level.SEVERE, ex, () -> MessageFormat.format("Unable to parse VCAP_SERVICES content: {0}", vcap_services_str));
            }
        } else {
            LOGGER.info("VCAP_SERVICES not defined");
        }
        Config configResult = ConfigFactory.parseMap(configOutput, "Environment Variables");
        return configResult;
    }
}
Then I activated this new application loader by creating an entry in the reference.conf configuration file under the conf folder.
play.application.loader=com.example.config.MyApplicationLoader
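For illustration, here is a minimal, hypothetical sketch of what the elided body of the try block might do for the VCAP_SERVICES shape shown above. The db.default.* keys are Play's standard ones; note that the postgres:// URI generally has to be rewritten as a jdbc:postgresql:// URL with separate username/password keys before HikariCP will accept it:

JsonNode credentials = rootNode.path("postgresql").path(0).path("credentials");
String host = credentials.path("hostname").textValue();
String port = credentials.path("port").textValue();
String dbname = credentials.path("dbname").textValue();
if (host != null && port != null && dbname != null) {
    configOutput.put("db.default.driver", "org.postgresql.Driver");
    configOutput.put("db.default.url", "jdbc:postgresql://" + host + ":" + port + "/" + dbname);
    configOutput.put("db.default.username", credentials.path("username").textValue());
    configOutput.put("db.default.password", credentials.path("password").textValue());
}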

Avro serialization exception - java.time.Instant cannot be cast to java.lang.Long

I want to send a Kafka message whose payload is a class that extends SpecificRecordBase; the class has been generated with the help of a Maven plugin.
One of the fields of my schema has the logical type timestamp-millis, which corresponds to java.time.Instant in the generated class.
The field is defined as follows:
{
    "name": "processingTime",
    "type": {
        "type": "long",
        "logicalType": "timestamp-millis"
    }
},
When I create an instance of this class and set the processing time,
setProcessingTime(RandomDate.randomInstant())
everything is ok, but when I run the program and try sending it to Kafka, I get the following error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class poc.avroGenerated.AvroMeasurement to class poc.avroSerde.AvroSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class java.time.Instant cannot be cast to class java.lang.Long (java.time.Instant and java.lang.Long are in module java.base of loader 'bootstrap')
Here's my custom serializer class:
@Override
public byte[] serialize(String topic, T data) {
    byte[] result = null;
    try {
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        BinaryEncoder binaryEncoder = EncoderFactory.get().binaryEncoder(byteArrayOutputStream, null);
        DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(data.getSchema());
        datumWriter.write(data, binaryEncoder);
        binaryEncoder.flush();
        byteArrayOutputStream.close();
        result = byteArrayOutputStream.toByteArray();
    } catch (IOException e) {
        LOGGER.error(e);
    }
    return result;
}
Use SpecificDatumWriter instead of GenericDatumWriter.
Drop in that one change, and your custom serializer looks fine!
This is frequently a point of confusion. In the Java implementation, the "generic" datum classes do not take into account any customizations built into a specific record, including logical type conversions.
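Concretely, the fix is a one-line change in the serializer above; a sketch, assuming T extends SpecificRecordBase as described in the question:

// SpecificDatumWriter consults the logical-type conversions registered on
// the generated class (timestamp-millis <-> java.time.Instant here), which
// GenericDatumWriter ignores.
DatumWriter<GenericRecord> datumWriter = new SpecificDatumWriter<>(data.getSchema());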

Ninja framework endpoint throws 500 error when trying to map JSON to custom object

So I've got a Ninja endpoint here:
public Result processRecurring(Context context, RecurOrderJSON recurOrderJSON) {
    String id = recurOrderJSON.id;
    String event_type = recurOrderJSON.event_type;
    String request_id = recurOrderJSON.request_id;
    //Map data = recurOrderJSON.data;
    //recurringRouter(event_type, data);
    log.info("ID value");
    log.info(id);
    return JsonResponse.build()
            .message("OK")
            .toResult();
}
The class I am trying to map to:
public class RecurOrderJSON {
    public String id;
    public String event_type;
    public String request_id;
    // Maybe switch data type?
    //public Map data;
}
And the route:
router.POST().route("/recurring").with(RecurringController::processRecurring);
I am just trying to send some simple JSON to a webhook and for some reason the object mapping doesn't seem to be working. I think maybe I am misunderstanding the documentation?
http://www.ninjaframework.org/documentation/working_with_json_jsonp.html
The example they give you is this:
If you send that JSON to your application via the HTTP body you only need to add the POJO class to the controller method and Ninja will parse the incoming JSON for you:
package controllers;

public class ApplicationController {

    public Result parsePerson(Person person) {
        String nameOfPerson = person.name; // will be John Johnson
        ...
    }
}
As far as I can tell, I am doing this correctly - or am I misunderstanding the documentation? Here's an example JSON object. Currently I am only trying to grab the top-level strings, but I'll eventually want to grab data as well:
{
    "id": "hook-XXXXX",
    "event_type": "tx-pending",
    "data": {
        "button_id": "static",
        "publisher_organization": "org-XXXXXXX",
        "campaign_id": "camp-097714a40aaf8965",
        "currency": "USD",
        "order_currency": "USD",
        "id": "tx-XXXXXXX",
        "category": "new-user-order",
        "modified_date": "2018-10-15T05:41:12.577Z",
        "order_total": 9680,
        "button_order_id": "btnorder-77c9e56fd990f127",
        "publisher_customer_id": "XymEz8GO2M",
        "rate_card_id": "ratecard-41480b2a6b1196a7",
        "advertising_id": null,
        "event_date": "2018-10-15T05:41:06Z",
        "status": "pending",
        "pub_ref": null,
        "account_id": "acc-4b17f5a014d0de1a",
        "btn_ref": "srctok-0adf9e958510b3f1",
        "order_id": null,
        "posting_rule_id": null,
        "order_line_items": [
            {
                "identifier": "Antique Trading Card",
                "description": "Includes Lifetime Warranty",
                "amount": 9680,
                "publisher_commission": 968,
                "attributes": {},
                "total": 9680,
                "quantity": 1
            }
        ],
        "order_click_channel": "webview",
        "order_purchase_date": null,
        "validated_date": null,
        "amount": 968,
        "customer_order_id": null,
        "created_date": "2018-10-15T05:41:12.577Z",
        "commerce_organization": "org-XXXXXX"
    },
    "request_id": "attempt-XXXXXXX"
}
Currently I am just trying to get the string values, yet I am constantly getting a 500 error and no other indication in my logs of any error.
As far as I can tell, Ninja should just automatically map the JSON to my object, correct?
I successfully reproduced your issue, and then fixed it.
First, for an easy way to try/test, I recommend these (temporary) modifications:
package controllers;

import models.RecurOrderJSON;
import ninja.Context;
import ninja.Result;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RecurringController {

    private static final Logger log = LoggerFactory.getLogger(RecurringController.class);

    public Result processRecurring(Context context, RecurOrderJSON recurOrderJSON) {
        log.info("recurOrderJSON => " + recurOrderJSON);
        return ninja.Results.ok();
    }
}
And then, update your model this way:
package models;

import java.util.Map;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

@JsonIgnoreProperties(ignoreUnknown = true)
public class RecurOrderJSON {

    public String id;
    public String event_type;
    public String request_id;
    public Map data;

    @Override
    public String toString() {
        return "RecurOrderJSON [id=" + id + ", event_type=" + event_type + ", request_id=" + request_id + ", data="
                + data.toString() + "]";
    }
}
You can notice:
The data type must stay raw (generics can't be used here).
The important @JsonIgnoreProperties(ignoreUnknown = true) annotation avoids deserialization issues if your source data does not perfectly match your model (be sure to use the recent version of the annotation, in the fasterxml sub-package, instead of the old one in the codehaus sub-package).
The toString() implementation is only there to allow a quick check that deserialization was OK or KO.
Then you can easily test the system with wget, or curl:
curl -H 'Content-Type: application/json' -d "@/tmp/jsonINput.json" -X POST http://localhost:8080/recurring
Note that it is very important to specify the Content-Type header so the body is interpreted as JSON.
With the /tmp/jsonINput.json file containing exactly the json contents you specified in your question.
This way, everything is working like a charm, obtaining this output:
recurOrderJSON => RecurOrderJSON [id=hook-XXXXX, event_type=tx-pending, request_id=attempt-XXXXXXX, data={button_id=static, publisher_organization=org-XXXXXXX, campaign_id=camp-097714a40aaf8965, currency=USD, order_currency=USD, id=tx-XXXXXXX, category=new-user-order, modified_date=2018-10-15T05:41:12.577Z, order_total=9680, button_order_id=btnorder-77c9e56fd990f127, publisher_customer_id=XymEz8GO2M, rate_card_id=ratecard-41480b2a6b1196a7, advertising_id=null, event_date=2018-10-15T05:41:06Z, status=pending, pub_ref=null, account_id=acc-4b17f5a014d0de1a, btn_ref=srctok-0adf9e958510b3f1, order_id=null, posting_rule_id=null, order_line_items=[{identifier=Antique Trading Card, description=Includes Lifetime Warranty, amount=9680, publisher_commission=968, attributes={}, total=9680, quantity=1}], order_click_channel=webview, order_purchase_date=null, validated_date=null, amount=968, customer_order_id=null, created_date=2018-10-15T05:41:12.577Z, commerce_organization=org-XXXXXX}]
Given the specific input code with the data field commented out
//public Map data;
and the posted input JSON that includes this field, the request should fail with 400 Bad Request. The reason is that Ninja uses Jackson for JSON parsing, and Jackson throws on unknown fields by default.
The quick workaround is to add the @JsonIgnoreProperties annotation to the RecurOrderJSON class, e.g.:
@JsonIgnoreProperties(ignoreUnknown = true)
public class RecurOrderJSON {
    ...
}
See: Ignoring new fields on JSON objects using Jackson
Now, if the error was not 400, there isn't much information to go by, as there doesn't seem to be anything else obviously wrong with the code.
Either post an SSCCE demonstrating the problem or attempt to debug by surfacing the error page with the following method:
Launch the application in debug mode with mvn package ninja:run
Access the end-point with a tool that allows to inspect the response in detail such as curl e.g.
Store request JSON in input.json
Run curl -v -o result.html -H 'Content-Type: application/json' --data '@input.json' http://localhost:8080/recurring
Open result.html to examine the response
Might it be that you are performing a bad request (hence the JSON is not found), but due to some Ninja bug it returns error 500?
For example, you can take a look here, where it is stated that parsing an empty JSON body in a JSON request leads to a misleading error (500) when it is supposed to return 400 "Bad Request".
Context is not needed in processRecurring; use Results.json() and render the original object:
public Result processRecurring(RecurOrderJSON recurOrderJSON) {
    String id = recurOrderJSON.id;
    String event_type = recurOrderJSON.event_type;
    String request_id = recurOrderJSON.request_id;
    //Map data = recurOrderJSON.data;
    //recurringRouter(event_type, data);
    log.info("ID value");
    log.info(id);
    return Results.json().render(recurOrderJSON);
}
Make sure you declare the package in your RecurOrderJSON:
package models;

public class RecurOrderJSON {
    public String id;
    public String event_type;
    public String request_id;
    // Maybe switch data type?
    //public Map data;
}
Good luck!

Cannot Load the file using getResourceAsStream

I am trying to learn Apache Avro, and I have started with a simple tutorial. I am trying to use a JSON schema to load the data. Below is my simple example:
public class AvroExample {

    public static Schema SCHEMA;  // writer's schema
    public static Schema SCHEMA2; // reader's schema

    private String name;
    private int age;
    private String[] mails;
    private AvroExample boss;

    static {
        try {
            SCHEMA = Schema.parse(AvroExample.class.getResourceAsStream("Employee.avsc"));
            SCHEMA2 = Schema.parse(AvroExample.class.getResourceAsStream("Employee2.avsc"));
        } catch (Exception e) {
            System.out.println("Couldn't load a schema: " + e.getMessage());
        }
    }

    // some more code
}
But somehow this line always gives me an exception:
SCHEMA = Schema.parse(AvroExample.class.getResourceAsStream("Employee.avsc"));
as: Couldn't load a schema: java.lang.NullPointerException
I believe it is either not able to load the file properly, or I am loading the file in the wrong way.
This is the file content:
{
    "type": "record",
    "name": "Employee",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "age", "type": "int"},
        {"name": "emails", "type": {"type": "array", "items": "string"}},
        {"name": "boss", "type": ["Employee", "null"]}
    ]
}
Below is a picture of my workspace, which shows where I have put those two avsc files.
Can anybody help me with this?
With the project setup you've shown us, your classpath will likely look like:
/root
    /Employee.avsc
    /Employee2.avsc
    /com
        /rjamal
            /avro
                /test
                    /AvroExperiment
                    /...
In other words, the two avsc files will be at the root of the classpath. The method call
AvroExample.class.getResourceAsStream("Employee.avsc")
looks for the resource in the package the AvroExample class is in.
To make it relative to the root of the classpath, prefix your path with a /.
AvroExample.class.getResourceAsStream("/Employee.avsc")
Check the javadoc
Before delegation, an absolute resource name is constructed from the
given resource name using this algorithm:
If the name begins with a '/' ('\u002f'), then the absolute name of the resource is the portion of the name following the '/'.
Otherwise, the absolute name is of the following form: modified_package_name/name
Where the modified_package_name is the package name of this object with '/' substituted for '.' ('\u002e').
Emphasis mine.
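As a side note (my own sketch, not part of the original answer): failing fast with an explicit message makes this kind of classpath problem much easier to spot, and Schema.Parser is the non-deprecated replacement for the static Schema.parse. Inside the static initializer, assuming imports for java.io.InputStream and java.util.Objects:

// Throws immediately with a readable message if the resource is missing,
// instead of a bare NullPointerException from inside Schema.parse.
InputStream in = Objects.requireNonNull(
        AvroExample.class.getResourceAsStream("/Employee.avsc"),
        "Employee.avsc not found at the classpath root");
SCHEMA = new Schema.Parser().parse(in);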
