I copied the MapReduce K-means code from the link below, created a JAR using Eclipse, and ran it on a numeric dataset.
Please suggest what this exception means and how to avoid it.
https://github.com/himank/K-Means/blob/master/src/KMeans.java
16/01/21 09:06:40 INFO mapreduce.Job: map 0% reduce 0%
16/01/21 09:06:52 INFO mapreduce.Job: Task Id : attempt_1453356915728_0009_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:446)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 17 more
Caused by: java.lang.NumberFormatException: empty String
at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1020)
at java.lang.Double.parseDouble(Double.java:540)
at km.Kmean$Map.configure(Kmean.java:48)
... 22 more
Container killed by the ApplicationMaster.
16/01/21 09:06:52 INFO mapreduce.Job: Task Id : attempt_1453356915728_0009_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:446)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 17 more
Caused by: java.lang.NumberFormatException: empty String
at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1020)
at java.lang.Double.parseDouble(Double.java:540)
at km.Kmean$Map.configure(Kmean.java:48)
... 22 more
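The root cause in both failed attempts is the last "Caused by": java.lang.NumberFormatException: empty String, thrown by Double.parseDouble inside Map.configure (Kmean.java:48). In other words, the mapper tries to parse an empty line as a double, presumably while reading the centroid file in configure(), so a blank or trailing line in that file is enough to kill the task. A minimal defensive-parsing sketch of that pattern (the class and path here are hypothetical; it only illustrates skipping blank lines before parsing, not the actual KMeans.java code):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper mirroring a centroid-loading loop like the one in configure():
// blank lines are skipped instead of being passed to Double.parseDouble.
public class CentroidLoader {
    public static List<Double> loadCentroids(String path) throws IOException {
        List<Double> centers = new ArrayList<Double>();
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                line = line.trim();
                if (line.isEmpty()) {
                    continue; // an empty line would otherwise throw "empty String"
                }
                centers.add(Double.parseDouble(line));
            }
        }
        return centers;
    }
}

Checking the centroid input for blank lines, or trimming and skipping them as above, should stop the tasks from failing in configure().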
Related
JavaFX was working, but strangely it now throws Segmentation Fault (core dumped).
I don't know where the logs are kept, so I launched app/SceneBuilderApp.jar directly and captured its output, just in case:
Exception in Application start method
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:389)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:328)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:917)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication$155(LauncherImpl.java:182)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Failed to load file:/opt/JavaFXSceneBuilder2.0/app/SceneBuilderApp.jar!/com/oracle/javafx/scenebuilder/app/DocumentWindow.fxml
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractFxmlWindowController.makeRoot(AbstractFxmlWindowController.java:114)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractWindowController.getRoot(AbstractWindowController.java:91)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractWindowController.getScene(AbstractWindowController.java:111)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractWindowController.getStage(AbstractWindowController.java:134)
at com.oracle.javafx.scenebuilder.app.DocumentWindowController.openWindow(DocumentWindowController.java:1088)
at com.oracle.javafx.scenebuilder.app.SceneBuilderApp.handleLaunch(SceneBuilderApp.java:418)
at com.oracle.javafx.scenebuilder.app.AppPlatform.requestStartGeneric(AppPlatform.java:139)
at com.oracle.javafx.scenebuilder.app.AppPlatform.requestStart(AppPlatform.java:106)
at com.oracle.javafx.scenebuilder.app.SceneBuilderApp.start(SceneBuilderApp.java:371)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$162(LauncherImpl.java:863)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$175(PlatformImpl.java:326)
at com.sun.javafx.application.PlatformImpl.lambda$null$173(PlatformImpl.java:295)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$174(PlatformImpl.java:294)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:95)
at com.sun.glass.ui.gtk.GtkApplication._runLoop(Native Method)
at com.sun.glass.ui.gtk.GtkApplication.lambda$null$49(GtkApplication.java:139)
... 1 more
Caused by: java.lang.RuntimeException: Failed to load file:/opt/JavaFXSceneBuilder2.0/app/SceneBuilderApp.jar!/com/oracle/javafx/scenebuilder/app/info/InfoPanel.fxml
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractFxmlPanelController.makePanel(AbstractFxmlPanelController.java:93)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractPanelController.getPanelRoot(AbstractPanelController.java:130)
at com.oracle.javafx.scenebuilder.app.DocumentWindowController.controllerDidLoadFxml(DocumentWindowController.java:985)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractFxmlWindowController.makeRoot(AbstractFxmlWindowController.java:110)
... 17 more
Caused by: java.lang.RuntimeException: Failed to load file:/opt/JavaFXSceneBuilder2.0/app/SceneBuilderKit.jar!/com/oracle/javafx/scenebuilder/kit/editor/panel/inspector/editors/StringAutoSuggestEditor.fxml
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.EditorUtils.loadFxml(EditorUtils.java:483)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.EditorUtils.loadFxml(EditorUtils.java:463)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.AutoSuggestEditor.preInit(AutoSuggestEditor.java:124)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.AutoSuggestEditor.<init>(AutoSuggestEditor.java:118)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.AutoSuggestEditor.<init>(AutoSuggestEditor.java:103)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.ControllerClassEditor.<init>(ControllerClassEditor.java:51)
at com.oracle.javafx.scenebuilder.app.info.InfoPanelController.performInitialization(InfoPanelController.java:186)
at com.oracle.javafx.scenebuilder.app.info.InfoPanelController.controllerDidLoadFxml(InfoPanelController.java:173)
at com.oracle.javafx.scenebuilder.kit.editor.panel.util.AbstractFxmlPanelController.makePanel(AbstractFxmlPanelController.java:89)
... 20 more
Caused by: javafx.fxml.LoadException:
file:/opt/JavaFXSceneBuilder2.0/app/SceneBuilderKit.jar!/com/oracle/javafx/scenebuilder/kit/editor/panel/inspector/editors/StringAutoSuggestEditor.fxml
at javafx.fxml.FXMLLoader.constructLoadException(FXMLLoader.java:2601)
at javafx.fxml.FXMLLoader.loadImpl(FXMLLoader.java:2571)
at javafx.fxml.FXMLLoader.loadImpl(FXMLLoader.java:2441)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2409)
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.EditorUtils.loadFxml(EditorUtils.java:481)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:71)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:275)
at javafx.fxml.FXMLLoader.loadImpl(FXMLLoader.java:2566)
... 31 more
Caused by: java.lang.NullPointerException
at com.oracle.javafx.scenebuilder.kit.editor.panel.inspector.editors.AutoSuggestEditor.initialize(AutoSuggestEditor.java:142)
... 41 more
Exception running application com.oracle.javafx.scenebuilder.app.SceneBuilderApp
Environment
Ubuntu 16.04
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
I had the same problem as you did.
As already mentioned in the thread, this seems to be a kernel bug. Updating Ubuntu as usual and additionally running dist-upgrade did the trick for me.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
See here for information on what dist-upgrade does.
I want to convert an RDD to a DataFrame and I'm pretty sure I follow the rules for doing it, but it seems this error keeps my code from running. I don't know how to resolve the duplicate class definition.
Here is my code in Spark:
val auctionDF = auctionsRDD.toDF()
Here's the error:
java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:284)
at com.mapr.fs.ShimLoader.load(ShimLoader.java:223)
at com.mapr.fs.MapRFileSystem.<clinit>(MapRFileSystem.java:107)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2112)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2206)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2674)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2723)
at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2711)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:454)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:462)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:444)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getNonCachedFileSystem(Hadoop23Shims.java:944)
at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3687)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:600)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:161)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:59)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:441)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:395)
at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:163)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:35)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:40)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:42)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:44)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:46)
at $line30.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:48)
at $line30.$read$$iw$$iw$$iw$$iw.<init>(<console>:50)
at $line30.$read$$iw$$iw$$iw.<init>(<console>:52)
at $line30.$read$$iw$$iw.<init>(<console>:54)
at $line30.$read$$iw.<init>(<console>:56)
at $line30.$read.<init>(<console>:58)
at $line30.$read$.<init>(<console>:62)
at $line30.$read$.<clinit>(<console>)
at $line30.$eval$.$print$lzycompute(<console>:7)
at $line30.$eval$.$print(<console>:6)
at $line30.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:415)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:923)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.LinkageError: loader (instance of org/apache/spark/sql/hive/client/IsolatedClientLoader$$anon$1): attempted duplicate class definition for name: "com/mapr/fs/shim/LibraryLoader"
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
... 98 more
java.lang.RuntimeException: Failure loading MapRClient.
at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:305)
at com.mapr.fs.ShimLoader.load(ShimLoader.java:223)
at com.mapr.fs.MapRFileSystem.<clinit>(MapRFileSystem.java:107)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2112)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2206)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2674)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2723)
at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2711)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:454)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:462)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:444)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getNonCachedFileSystem(Hadoop23Shims.java:944)
at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3687)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:600)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:161)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:59)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:441)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:395)
at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:163)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:35)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:40)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:42)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:44)
at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:46)
at $line30.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:48)
at $line30.$read$$iw$$iw$$iw$$iw.<init>(<console>:50)
at $line30.$read$$iw$$iw$$iw.<init>(<console>:52)
at $line30.$read$$iw$$iw.<init>(<console>:54)
at $line30.$read$$iw.<init>(<console>:56)
at $line30.$read.<init>(<console>:58)
at $line30.$read$.<init>(<console>:62)
at $line30.$read$.<clinit>(<console>)
at $line30.$eval$.$print$lzycompute(<console>:7)
at $line30.$eval$.$print(<console>:6)
at $line30.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:415)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:923)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:284)
... 94 more
Caused by: java.lang.LinkageError: loader (instance of org/apache/spark/sql/hive/client/IsolatedClientLoader$$anon$1): attempted duplicate class definition for name: "com/mapr/fs/shim/LibraryLoader"
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
... 98 more
java.lang.RuntimeException: Failure loading MapRClient.
at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:305)
at com.mapr.fs.ShimLoader.load(ShimLoader.java:223)
at com.mapr.fs.MapRFileSystem.<clinit>(MapRFileSystem.java:107)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2112)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2206)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2674)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2723)
at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2711)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:454)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:462)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:444)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getNonCachedFileSystem(Hadoop23Shims.java:944)
at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3687)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:600)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:161)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:59)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:441)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:395)
at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:163)
... 50 elided
Caused by: java.lang.reflect.InvocationTargetException: java.lang.LinkageError: loader (instance of org/apache/spark/sql/hive/client/IsolatedClientLoader$$anon$1): attempted duplicate class definition for name: "com/mapr/fs/shim/LibraryLoader"
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:284)
... 94 more
Caused by: java.lang.LinkageError: loader (instance of org/apache/spark/sql/hive/client/IsolatedClientLoader$$anon$1): attempted duplicate class definition for name: "com/mapr/fs/shim/LibraryLoader"
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
... 98 more
Exception:
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)
... 9 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1627)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3317)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3356)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3336)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3590)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1625)
... 23 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables: java.lang.reflect.InvocationTargetException
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:671)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:834)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:424)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:453)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:327)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:294)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:581)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:546)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:612)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:398)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6396)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:330)
at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:203)
at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:162)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:284)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
... 57 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("org.apache.derby.jdbc.ClientDriver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:232)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:117)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
... 75 more
Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("org.apache.derby.jdbc.ClientDriver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:213)
... 77 more
hive-site.xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/usr/local/hive/metastore_db;create=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.ClientDriver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
</configuration>
1) Replace the value of javax.jdo.option.ConnectionDriverName with org.apache.derby.jdbc.EmbeddedDriver
or
2) Add http://central.maven.org/maven2/org/apache/derby/derbyclient/10.10.1.1/derbyclient-10.10.1.1.jar to the hive-client/lib path
and try again.
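If it is unclear which of the two options applies, a quick sanity check is to see whether the Derby driver classes are visible on the Hive client classpath at all. A hypothetical helper (not part of the answer above), to be run with the same classpath Hive uses, e.g. java -cp "/usr/local/hive/lib/*:." DerbyDriverCheck:

// Hypothetical classpath check: reports whether the embedded and client Derby
// JDBC drivers can be loaded from the current classpath.
public class DerbyDriverCheck {
    public static void main(String[] args) {
        String[] drivers = {
            "org.apache.derby.jdbc.EmbeddedDriver",
            "org.apache.derby.jdbc.ClientDriver"
        };
        for (String driver : drivers) {
            try {
                Class.forName(driver);
                System.out.println(driver + " : found");
            } catch (ClassNotFoundException e) {
                System.out.println(driver + " : NOT found");
            }
        }
    }
}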
Hi, I want to start my Java application on a Raspberry Pi. I export DISPLAY=:0.0
and then I start the program like this: /opt/jdk1.8.0/bin/java -jar Main.jar
and I see this:
Exception in Application start method
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:367)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:305)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:894)
at com.sun.javafx.application.LauncherImpl.access$000(LauncherImpl.java:56)
at com.sun.javafx.application.LauncherImpl$1.run(LauncherImpl.java:158)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.IllegalArgumentException: Image dimensions must be positive (w,h > 0)
at javafx.scene.image.Image.<init>(Image.java:720)
at javafx.scene.image.WritableImage.<init>(WritableImage.java:74)
at application.Main4.makeTransparent(Main4.java:436)
at application.Main4.start(Main4.java:129)
at com.sun.javafx.application.LauncherImpl$8.run(LauncherImpl.java:837)
at com.sun.javafx.application.PlatformImpl$7.run(PlatformImpl.java:335)
at com.sun.javafx.application.PlatformImpl$6$1.run(PlatformImpl.java:301)
at com.sun.javafx.application.PlatformImpl$6$1.run(PlatformImpl.java:298)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl$6.run(PlatformImpl.java:298)
at com.sun.glass.ui.lens.LensApplication$RunnableEvent.dispatch(LensApplication.java:182)
at com.sun.glass.ui.lens.LensApplication._runLoop(LensApplication.java:860)
at com.sun.glass.ui.lens.LensApplication.access$1800(LensApplication.java:58)
at com.sun.glass.ui.lens.LensApplication$4.run(LensApplication.java:917)
... 1 more
Exception running application application.Main4
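For reference, new WritableImage(w, h) throws exactly this IllegalArgumentException whenever either dimension is not strictly positive, which usually means the width/height passed in makeTransparent were measured from a stage, scene, or screen that had no size yet on that display. A minimal guard sketch (the method name and the size source are assumptions based on the trace, not the actual Main4 code):

import javafx.scene.image.WritableImage;

// Hypothetical guard around the WritableImage creation seen at Main4.makeTransparent(Main4.java:436):
// fall back to a 1x1 image if the measured bounds are not yet positive.
public final class TransparentImageUtil {
    public static WritableImage createTransparentImage(double width, double height) {
        int w = (int) Math.floor(width);
        int h = (int) Math.floor(height);
        if (w <= 0 || h <= 0) {
            // e.g. the bounds were queried before the window was shown on the Pi's display
            w = 1;
            h = 1;
        }
        return new WritableImage(w, h);
    }
}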
The error message is:
Error during execution of Sonar
Error status [command: /usr/lib/jvm/java-7-oracle/jre/bin/java -cp /tmp/sonar-runner-impl4646048652546322996.jar org.sonar.runner.impl.BatchLauncherMain /tmp/sonar-project1330416023146003621.properties]: 1
SonarQube Console:
Exception in thread "main" org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.impl.BatchLauncherMain.execute(BatchLauncherMain.java:41)
at org.sonar.runner.impl.BatchLauncherMain.main(BatchLauncherMain.java:59)
Caused by: org.sonar.api.utils.HttpDownloader$HttpException: Fail to download [http://localhost:9000/batch_bootstrap/db?project=retailer]. Response code: 500
at org.sonar.api.utils.HttpDownloader$BaseHttpDownloader$HttpInputSupplier.getInput(HttpDownloader.java:281)
at org.sonar.api.utils.HttpDownloader$BaseHttpDownloader$HttpInputSupplier.getInput(HttpDownloader.java:235)
at com.google.common.io.ByteStreams.copy(ByteStreams.java:116)
at com.google.common.io.Files.copy(Files.java:231)
at org.sonar.batch.bootstrap.ServerClient.download(ServerClient.java:69)
at org.sonar.batch.bootstrap.PreviewDatabase.downloadDatabase(PreviewDatabase.java:101)
at org.sonar.batch.bootstrap.PreviewDatabase.start(PreviewDatabase.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84)
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169)
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132)
at org.picocontainer.behaviors.Stored.start(Stored.java:110)
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1015)
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1008)
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:766)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:91)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:92)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:74)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
... 6 more