Spring Boot App crashes at startup - java.lang.IllegalStateException - java

I am using EKS for the deployment of my Spring Boot App.
At startup, it crashes a couple of times, gets restarted by EKS, and finally starts serving requests.
Here are the logs:
2022-04-20 13:08:51.836 INFO 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.retry.annotation.RetryConfiguration' of type [org.springframework.retry.annotation.RetryConfiguration$$EnhancerBySpringCGLIB$$e8ec2216] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2022-04-20 13:08:51.848 INFO 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$f428cee] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2022-04-20 13:09:42.455 ERROR 1 --- [ main] o.s.boot.SpringApplication : Application run failed
java.lang.IllegalStateException: Logback configuration error detected:
ERROR in c.q.l.core.rolling.SizeAndTimeBasedRollingPolicy#1248276879 - Unexpected exception while waiting for compression job to finish java.lang.InterruptedException
ERROR in c.q.l.core.rolling.SizeAndTimeBasedRollingPolicy#828088650 - Unexpected exception while waiting for compression job to finish java.lang.InterruptedException
ERROR in c.q.l.core.rolling.SizeAndTimeBasedRollingPolicy#1248276879 - Timeout while waiting for clean-up job to finish java.util.concurrent.TimeoutException
ERROR in c.q.l.core.rolling.SizeAndTimeBasedRollingPolicy#828088650 - Timeout while waiting for clean-up job to finish java.util.concurrent.TimeoutException
at org.springframework.boot.logging.logback.LogbackLoggingSystem.loadConfiguration(LogbackLoggingSystem.java:169)
at org.springframework.boot.logging.AbstractLoggingSystem.initializeWithConventions(AbstractLoggingSystem.java:82)
at org.springframework.boot.logging.AbstractLoggingSystem.initialize(AbstractLoggingSystem.java:60)
at org.springframework.boot.logging.logback.LogbackLoggingSystem.initialize(LogbackLoggingSystem.java:117)
at org.springframework.boot.context.logging.LoggingApplicationListener.initializeSystem(LoggingApplicationListener.java:290)
at org.springframework.boot.context.logging.LoggingApplicationListener.initialize(LoggingApplicationListener.java:263)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEnvironmentPreparedEvent(LoggingApplicationListener.java:226)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:199)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:75)
at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:347)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:306)
at com.dt.Application.main(Application.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
From my research so far, the first two lines are warnings; they shouldn't cause the app to fail (let me know if I am wrong).
The application run failed and four Logback errors were printed:
Logback's TimeBasedRollingPolicy has start() and stop() methods. When stop() is called, two async jobs are run: a compression job and a clean-up job. Here, the compression job threw InterruptedException and the clean-up job threw TimeoutException.
I have a few questions:
I am not sure whether these Logback errors caused the app to crash, or vice versa.
What invoked the stop() method of TimeBasedRollingPolicy?
What interrupted the compression job?
If you have any idea why this is happening, please let me know.
Thanks :)

It says "Logback configuration error detected", so I would check your Logback configuration file (typically logback-spring.xml or logback.xml) for a typo or an invalid setting.
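For reference, here is a minimal logback-spring.xml sketch using SizeAndTimeBasedRollingPolicy to compare against; the paths, sizes, and pattern are placeholder assumptions, not values from the question, and in a container it is also worth confirming that the log directory exists and is writable:
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/app/app.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
      <!-- %i is required together with maxFileSize for size-and-time based rolling -->
      <fileNamePattern>/var/log/app/app.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
      <maxFileSize>100MB</maxFileSize>
      <maxHistory>7</maxHistory>
      <totalSizeCap>1GB</totalSizeCap>
    </rollingPolicy>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>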

Related

Why am I getting java.io.FileNotFoundException while running a Spark job?

I am trying to run a Spark job using the gcloud command below.
gcloud dataproc jobs submit spark \
--cluster=clusterName \
--class=clazzName \
--jars=gs://abc/def/ghi.jar \
--region=us-central1 \
--files=gs://abc/def/jkl.json \
--properties=spark.driver.extraJavaOptions="-Dconfig.file=application_dev.json",spark.executor.extraJavaOptions="-Dconfig.file=application_dev.json",spark.executor.memory=6G,spark.driver.memory=4G,spark.executor.cores=3,spark.executor.instances=4
I am getting the error below:
ERROR org.apache.spark.SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File not found: gs://temp-bucket/f80f8cb3-0358-445e-8ec2-819e4282bfe4/spark-job-history
Full stack trace
Waiting for job output...
22/09/02 05:30:47 INFO com.polaris.ihub.commons.utils.keymaker.KeymakerApi: ====== Reading App Context ======
22/09/02 05:30:47 INFO com.polaris.ihub.commons.utils.keymaker.KeymakerApi: File to be read from -> gs://abc/def/app_context.txt
22/09/02 05:30:49 INFO org.apache.spark.SparkEnv: Registering MapOutputTracker
22/09/02 05:30:49 INFO org.apache.spark.SparkEnv: Registering BlockManagerMaster
22/09/02 05:30:49 INFO org.apache.spark.SparkEnv: Registering BlockManagerMasterHeartbeat
22/09/02 05:30:49 INFO org.apache.spark.SparkEnv: Registering OutputCommitCoordinator
22/09/02 05:30:49 INFO org.sparkproject.jetty.util.log: Logging initialized #8633ms to org.sparkproject.jetty.util.log.Slf4jLog
22/09/02 05:30:50 INFO org.sparkproject.jetty.server.Server: jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git: someAlphaNumeric1; jvm 1.8.0_322-b06
22/09/02 05:30:50 INFO org.sparkproject.jetty.server.Server: Started #8803ms
22/09/02 05:30:50 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector#2db33feb{HTTP/1.1, (http/1.1)}{0.0.0.0:38111}
22/09/02 05:30:50 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at clusterName-m/someIp:8032
22/09/02 05:30:51 INFO org.apache.hadoop.yarn.client.AHSProxy: Connecting to Application History server at clusterName-m/someIp:10200
22/09/02 05:30:51 INFO org.apache.hadoop.conf.Configuration: resource-types.xml not found
22/09/02 05:30:51 INFO org.apache.hadoop.yarn.util.resource.ResourceUtils: Unable to find 'resource-types.xml'.
22/09/02 05:30:55 INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl: Submitted application application_appID
22/09/02 05:30:56 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at clusterName-m/someIp:8030
22/09/02 05:30:57 ERROR org.apache.spark.SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File not found: gs://temp-bucket/f80f8cb3-0358-445e-8ec2-819e4282bfe4/spark-job-history
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.getFileStatus(GoogleHadoopFileSystemBase.java:958)
at org.apache.spark.deploy.history.EventLogFileWriter.requireLogBaseDirAsDirectory(EventLogFileWriters.scala:77)
at org.apache.spark.deploy.history.SingleEventLogFileWriter.start(EventLogFileWriters.scala:221)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:83)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:612)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2680)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:945)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
at claszzName2.spark.adaptor.utils.SparkUtils.newSparkSession(SparkUtils.java:42)
at claszzName2.spark.adaptor.bqtopubsub.BqToPubSubAdaptor.main(BqToPubSubAdaptor.java:30)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/09/02 05:30:57 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark#2db33feb{HTTP/1.1, (http/1.1)}{0.0.0.0:0}
22/09/02 05:30:58 ERROR clazzName: Spark Batch Application failed : {}
java.io.FileNotFoundException: File not found: gs://temp-bucket/f80f8cb3-0358-445e-8ec2-819e4282bfe4/spark-job-history
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.getFileStatus(GoogleHadoopFileSystemBase.java:958)
at org.apache.spark.deploy.history.EventLogFileWriter.requireLogBaseDirAsDirectory(EventLogFileWriters.scala:77)
at org.apache.spark.deploy.history.SingleEventLogFileWriter.start(EventLogFileWriters.scala:221)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:83)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:612)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2680)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:945)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
at claszzName2.spark.adaptor.utils.SparkUtils.newSparkSession(SparkUtils.java:42)
at claszzName2.spark.adaptor.bqtopubsub.BqToPubSubAdaptor.main(BqToPubSubAdaptor.java:30)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.RuntimeException
at clazzName.main(BqToPubSubAdaptor.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
ERROR: (gcloud.dataproc.jobs.submit.spark) Job [d8c3e7e5e8004e5bba72b921d454bfeb] failed with error:
Google Cloud Dataproc Agent reports job failure. If logs are available, they can be found at:
It seems GCP Dataproc/Spark is looking for the default history-server event-log location, which may not exist at the expected path. You can override two properties to change where the event logs are written:
spark.eventLog.dir
spark.history.fs.logDirectory
You can either pass these properties when submitting the Spark job, as in the example below, or set them in your spark-defaults.conf file (see the sketch after the command). For example:
gcloud dataproc jobs submit spark \
--cluster=clusterName \
--class=clazzName \
--jars=gs://abc/def/ghi.jar \
--region=us-central1 \
--files=gs://abc/def/jkl.json \
--properties=spark.driver.extraJavaOptions="-Dconfig.file=application_dev.json",spark.executor.extraJavaOptions="-Dconfig.file=application_dev.json",spark.executor.memory=6G,spark.driver.memory=4G,spark.executor.cores=3,spark.executor.instances=4,spark.history.fs.logDirectory=gs://<bucket-name>/<folder-name>,spark.eventLog.dir=gs://<bucket-name>/<folder-name>
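Alternatively, a minimal spark-defaults.conf sketch with the same two properties; the bucket and folder names below are placeholders, not values from the original question:
# spark-defaults.conf (typically under /etc/spark/conf/ on Dataproc nodes)
spark.eventLog.dir               gs://<bucket-name>/<folder-name>
spark.history.fs.logDirectory    gs://<bucket-name>/<folder-name>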

Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/datatype/jsr310/JavaTimeModule

I am following this tutorial to run the Spark-Pi application using kubectl: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md#running-the-examples
When I submit
kubectl apply -f spark-pi.yaml
and check the logs using kubectl logs spark-pi-driver -f, I see this exception:
20/03/20 01:47:45 INFO SparkEnv: Registering OutputCommitCoordinator
20/03/20 01:47:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/03/20 01:47:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-pi-1584668857472-driver-svc.default.svc:4040
20/03/20 01:47:46 INFO SparkContext: Added JAR file:///opt/spark/examples/jars/spark-examples_2.11-2.4.3.jar at spark://spark-pi-1584668857472-driver-svc.default.svc:7078/jars/spark-examples_2.11-2.4.3.jar with timestamp 1584668866199
Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/datatype/jsr310/JavaTimeModule
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.<clinit>(OperationSupport.java:59)
at io.fabric8.kubernetes.client.DefaultKubernetesClient.pods(DefaultKubernetesClient.java:204)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
at scala.Option.map(Option.scala:146)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.<init>(ExecutorPodsAllocator.scala:55)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:89)
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.datatype.jsr310.JavaTimeModule
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 28 more
20/03/20 01:47:47 INFO DiskBlockManager: Shutdown hook called
20/03/20 01:47:47 INFO ShutdownHookManager: Shutdown hook called
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /var/data/spark-97c7e689-9506-42a1-b3b1-578270832f75/spark-e99b532d-3c81-4a76-a05c-a4a753627db2/userFiles-7d18e8e4-74d6-4dbc-b967-31cd2c6d96d3
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /var/data/spark-97c7e689-9506-42a1-b3b1-578270832f75/spark-e99b532d-3c81-4a76-a05c-a4a753627db2
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /tmp/spark-72a95874-127a-4f0e-b32a-22d1aec74e1c
Any help on how to resolve this?
This jackson-annotations jar comes in the jars folder of spark-2.4.3-bin-hadoop2.7, so I am not sure why it is not being picked up on the classpath. Any help would be appreciated.
As pointed out by @Andreas, ${SPARK_HOME}/jars doesn't contain jackson-datatype-jsr310.
You can try to modify spark-docker/Dockerfile and see how it works:
. . .
ADD https://repo1.maven.org/maven2/com/fasterxml/jackson/datatype/jackson-datatype-jsr310/2.9.10/jackson-datatype-jsr310-2.9.10.jar $SPARK_HOME/jars
. . .
It seems like a bug though, and if this helps, please raise the issue in the repo.
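Alternatively, if you would rather ship the missing module with the application build itself, a Maven snippet along these lines can be added; the 2.9.10 version matches the jar referenced above, but ideally it should match the Jackson version Spark already ships with:
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.9.10</version>
</dependency>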
If you are using Spring Boot, note that Jackson is included by default, so remove your explicit Jackson dependency and it should work.

spring cloud task taskLifecycleListener error when running in dataflow server

I implemented the batch-job sample given in the Spring Cloud Task samples. The Spring Boot version used in it is 2.0.1.RELEASE. Since I have to port another job to Spring Cloud Task, I wanted to find the oldest compatible version. The batch-job jar created with 1.3.2.RELEASE as the Spring Boot version runs successfully. But when I run it from the UI of Spring Cloud Data Flow after adding it as an app, I get the following error:
2018-09-18 11:51:02.755 WARN 12368 --- [ main] s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 1 not found
2018-09-18 11:51:02.759 INFO 12368 --- [ main] o.s.j.e.a.AnnotationMBeanExporter : Unregistering JMX-exposed beans on shutdown
2018-09-18 11:51:02.763 ERROR 12368 --- [ main] o.s.c.t.listener.TaskLifecycleListener : An event to end a task has been received for a task that has not yet started.
2018-09-18 11:51:02.767 INFO 12368 --- [ main] o.s.j.d.e.EmbeddedDatabaseFactory : Shutting down embedded database: url='jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=false'
2018-09-18 11:51:02.781 ERROR 12368 --- [ main] o.s.boot.SpringApplication : Application startup failed
org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 1 not found
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:176) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:51) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:346) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:149) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:112) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:852) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:766) [spring-boot-1.3.2.RELEASE.jar!/:1.3.2.RELEASE]
at org.springframework.boot.SpringApplication.createAndRefreshContext(SpringApplication.java:361) [spring-boot-1.3.2.RELEASE.jar!/:1.3.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307) [spring-boot-1.3.2.RELEASE.jar!/:1.3.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1191) [spring-boot-1.3.2.RELEASE.jar!/:1.3.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1180) [spring-boot-1.3.2.RELEASE.jar!/:1.3.2.RELEASE]
at io.spring.BatchJobApplication.main(BatchJobApplication.java:14) [batch-job-2.0.1.BUILD-SNAPSHOT.jar!/:2.0.1.BUILD-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_181]
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:54) [batch-job-2.0.1.BUILD-SNAPSHOT.jar!/:2.0.1.BUILD-SNAPSHOT]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181]
Caused by: java.lang.IllegalArgumentException: Invalid TaskExecution, ID 1 not found
at org.springframework.util.Assert.notNull(Assert.java:115) ~[spring-core-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.cloud.task.listener.TaskLifecycleListener.doTaskStart(TaskLifecycleListener.java:233) ~[spring-cloud-task-core-2.0.1.BUILD-SNAPSHOT.jar!/:2.0.1.BUILD-SNAPSHOT]
at org.springframework.cloud.task.listener.TaskLifecycleListener.start(TaskLifecycleListener.java:355) ~[spring-cloud-task-core-2.0.1.BUILD-SNAPSHOT.jar!/:2.0.1.BUILD-SNAPSHOT]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:173) ~[spring-context-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
... 18 common frames omitted
This is mostly caused by the task not getting registered in the given database. When I hit this Invalid TaskExecution ID issue, I passed the database credentials to the Data Flow server and passed the same DB details while launching the task; both should be in sync on the DB credentials. After that, all the task executions were recorded in the supplied DB with the task ID, execution start time, execution end time, etc.
task launch taskTimestmp2 --arguments "--spring.datasource.url=jdbc:mysql://localhost:3306/mydb --spring.datasource.username=root --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver"
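For illustration, the Data Flow server can be started with the same datasource settings so that both sides point at one database; the jar version and password below are placeholders, not values from the original answer:
java -jar spring-cloud-dataflow-server-local-<version>.jar \
  --spring.datasource.url=jdbc:mysql://localhost:3306/mydb \
  --spring.datasource.username=root \
  --spring.datasource.password=<password> \
  --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver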

BeanCreationException when creating an ElasticSearchRepository - Could not resolve matching constructor

We have a Spring Boot application built with Maven. From today it started throwing the error below. There were a few changes yesterday; we tried rolling back to an older version of the code but still get the same error.
java.lang.IllegalStateException: Error processing condition on org.springframework.boot.devtools.autoconfigure.DevToolsDataSourceAutoConfiguration
at org.springframework.boot.autoconfigure.condition.SpringBootCondition.matches(SpringBootCondition.java:64)
at org.springframework.context.annotation.ConditionEvaluator.shouldSkip(ConditionEvaluator.java:102)
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader$TrackedConditionEvaluator.shouldSkip(ConfigurationClassBeanDefinitionReader.java:436)
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader$TrackedConditionEvaluator.shouldSkip(ConfigurationClassBeanDefinitionReader.java:425)
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.loadBeanDefinitionsForConfigurationClass(ConfigurationClassBeanDefinitionReader.java:127)
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.loadBeanDefinitions(ConfigurationClassBeanDefinitionReader.java:116)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:336)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:246)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanDefinitionRegistryPostProcessors(PostProcessorRegistrationDelegate.java:270)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:93)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:686)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:524)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:122)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:761)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:371)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
at com.infy.mp.MarketplaceApp.main(MarketplaceApp.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'applicationSearchRepository': Could not resolve matching constructor (hint: specify index/type/name arguments for simple parameters to avoid type ambiguities)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:240)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1148)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1050)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getSingletonFactoryBeanForTypeCheck(AbstractAutowireCapableBeanFactory.java:881)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:808)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:544)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:425)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:390)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:384)
at org.springframework.boot.devtools.autoconfigure.DevToolsDataSourceAutoConfiguration$DevToolsDataSourceCondition.getMatchOutcome(DevToolsDataSourceAutoConfiguration.java:133)
at org.springframework.boot.autoconfigure.condition.SpringBootCondition.matches(SpringBootCondition.java:47)
... 21 common frames omitted
2016-12-16 12:26:26.519 WARN 7772 --- [ restartedMain] ationConfigEmbeddedWebApplicationContext : Exception thrown from LifecycleProcessor on context close
java.lang.IllegalStateException: LifecycleProcessor not initialized - call 'refresh' before invoking lifecycle methods via the context: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#4435f708: startup date [Fri Dec 16 12:26:18 IST 2016]; root of context hierarchy
at org.springframework.context.support.AbstractApplicationContext.getLifecycleProcessor(AbstractApplicationContext.java:417)
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1002)
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:961)
at org.springframework.boot.SpringApplication.handleRunFailure(SpringApplication.java:818)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:326)
at com.infy.mp.MarketplaceApp.main(MarketplaceApp.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
2016-12-16 12:26:26.520 ERROR 7772 --- [ restartedMain] o.s.b.f.s.DefaultListableBeanFactory : Destroy method on bean with name 'org.springframework.boot.autoconfigure.internalCachingMetadataReaderFactory' threw an exception
java.lang.IllegalStateException: ApplicationEventMulticaster not initialized - call 'refresh' before multicasting events via the context: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#4435f708: startup date [Fri Dec 16 12:26:18 IST 2016]; root of context hierarchy
at org.springframework.context.support.AbstractApplicationContext.getApplicationEventMulticaster(AbstractApplicationContext.java:404)
at org.springframework.context.support.ApplicationListenerDetector.postProcessBeforeDestruction(ApplicationListenerDetector.java:97)
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:253)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:578)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:554)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:954)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:523)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:961)
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1033)
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1009)
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:961)
at org.springframework.boot.SpringApplication.handleRunFailure(SpringApplication.java:818)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:326)
at com.infy.mp.MarketplaceApp.main(MarketplaceApp.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Below is the ApplicationSearchRepository interface
package com.test.mp.search.repository;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import com.test.mp.domain.application.ApplicationElasticSearch;
public interface ApplicationSearchRepository extends ElasticsearchRepository<ApplicationElasticSearch, String> {
public ApplicationElasticSearch findOneByJobIdAndPosterIdAndSeekerProfileId(String jobid, String posterid, String seekerprofileid);
}
I'm quite new to Java; please let me know if you need any other info.
Update
The issue was resolved after updating the Maven dependency versions of the spring-data-* packages as shown below (non-working versions are in the comments).
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons</artifactId>
<!--<version>1.13.0.BUILD-SNAPSHOT</version>-->
<version>1.13.0.M1</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-cassandra</artifactId>
<!--<version>1.5.0.BUILD-SNAPSHOT</version>-->
<version>2.0.0.M1</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-cql</artifactId>
<!--<version>1.5.0.BUILD-SNAPSHOT</version>-->
<version>2.0.0.M1</version>
</dependency>
But now the integration tests continue to throw the same error. There are no spring-data-related dependencies in test scope except cassandra-unit. Any clue why the tests are failing?
As noted in the question update, changing the Maven versions from BUILD-SNAPSHOT to a milestone or release fixed the issue. It seems that when using BUILD-SNAPSHOT, the jar files get updated in the Maven repository whenever there is a new build, which appears to be why our application that was working one day stopped working the next.
As for the issue while running the tests in Eclipse, as suggested in this SO answer, the .classpath file was referring to the BUILD-SNAPSHOT version of the jars. After removing those entries the tests also worked fine in Eclipse.

How to make a Spring Boot application start after login on Linux?

I'm trying to make my Spring Boot application start after the Linux login but I'm failing; can anyone tell me what I am doing wrong?
I've got two problems:
I can run the application only by invoking it as the sudo user; if I don't, the following error occurs:
java.lang.IllegalStateException: Tomcat connector in failed state
at org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainer.start(TomcatEmbeddedServletContainer.java:157)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.startEmbeddedServletContainer(EmbeddedWebApplicationContext.java:288)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.finishRefresh(EmbeddedWebApplicationContext.java:140)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:483)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:117)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:689)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:321)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:969)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:958)
at br.com.nextinfo.MainApplication.main(MainApplication.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53)
at java.lang.Thread.run(Thread.java:745)
2015-11-16 19:59:22.368 INFO 3400 --- [ main] ationConfigEmbeddedWebApplicationContext : Closing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#52f6002c: startup date [Mon Nov 16 19:59:13 BRST 2015]; root of context hierarchy
2015-11-16 19:59:22.369 INFO 3400 --- [ main] o.s.j.e.a.AnnotationMBeanExporter : Unregistering JMX-exposed beans on shutdown
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Tomcat connector in failed state
at org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainer.start(TomcatEmbeddedServletContainer.java:157)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.startEmbeddedServletContainer(EmbeddedWebApplicationContext.java:288)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.finishRefresh(EmbeddedWebApplicationContext.java:140)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:483)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:117)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:689)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:321)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:969)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:958)
at br.com.nextinfo.MainApplication.main(MainApplication.java:28)
... 6 more
Can someone explain to me why this error occurs?
How can I make the application run after the user logs in on Linux, given that the login happens automatically and without sudo permission? The jar should run with 755 permissions, right?
Thank you in advance.
This should be straightforward. I believe you are using port 80 and not 8080: ports below 1024 are privileged ports and hence require a root/sudo user.
Refer to this doc:
http://www.w3.org/Daemon/User/Installation/PrivilegedPorts.html
Either pass the server port via a JVM run argument such as -Dserver.port=8080, or set server.port=8080 in application.properties, as shown below.
The above should work just fine.
Hope this helps.
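For example, either of the following; app.jar below is a placeholder for your packaged jar:
java -Dserver.port=8080 -jar app.jar
or add the property to src/main/resources/application.properties:
server.port=8080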
