Errors while enabling ProGuard - Android Development - java

I'm getting the following errors during 'Generate Signed APK' when I enable minify in build.gradle. Can anyone show me where the issue is? I'm unable to find it since I'm new to Android development, and the trace doesn't show any class names related to my files.
org.gradle.initialization.ReportedException: org.gradle.internal.exceptions.LocationAwareException: Execution failed for task ':app:transformClassesAndResourcesWithProguardForRelease'.
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.gradle.internal.exceptions.LocationAwareException: Execution failed for task ':app:transformClassesAndResourcesWithProguardForRelease'.
Caused by: org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':app:transformClassesAndResourcesWithProguardForRelease'.
... 3 more
Caused by: java.lang.RuntimeException: Job failed, see logs for details
at com.android.build.gradle.internal.transforms.ProGuardTransform.transform(ProGuardTransform.java:196)
at com.android.build.gradle.internal.pipeline.TransformTask$2.call(TransformTask.java:221)
at com.android.build.gradle.internal.pipeline.TransformTask$2.call(TransformTask.java:217)
at com.android.builder.profile.ThreadRecorder.record(ThreadRecorder.java:102)
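For context (not from the original post): "Job failed, see logs for details" hides the actual ProGuard warnings. Rerunning the build from the command line with ./gradlew assembleRelease --info (or opening the full Gradle Console in Android Studio) usually surfaces "can't find referenced class" warnings naming the libraries at fault. A minimal sketch of the usual setup, assuming a standard Android Gradle project — the rule targets below are placeholders, not taken from the question:

android {
    buildTypes {
        release {
            minifyEnabled true
            // mapping.txt lands under build/outputs/mapping/release/ and lets you
            // translate obfuscated stack traces back to real class names
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}

Then, in proguard-rules.pro, each warning from the log is typically silenced or kept explicitly:

# placeholder: optional dependency of a library you use
-dontwarn okio.**
# placeholder: classes accessed via reflection must be kept
-keep class com.example.model.** { *; }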

Related

Spark csv write to S3 from dataset failed with java.lang.NoSuchMethodError

I am using Spark 3.2.2 and need to write a dataset to S3 in CSV format.
I have both aws-java-sdk-bundle and hadoop-aws declared as:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-bundle</artifactId>
    <version>1.11.901</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>3.3.1</version>
</dependency>
My Spark S3 write code, called from the driver, is:
public void saveRejectsS3(Dataset<Row> rejects, String date)
{
    String path = "s3a://test-bucket/" + date;
    logger.info("s3 path:= " + path);
    try {
        rejects.show(1);
        rejects.repartition(1).write().format("csv").option("header", "true").mode(SaveMode.Overwrite)
                .save(path);
        logger.info("saveS3 successful");
    } catch (Exception e) {
        logger.error("error:" + e);
    }
}
When I call this function I get the following error:
java.io.IOException: regular upload failed: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
STACKTRACE:
2022-11-29 03:24:22,182+0000 [main] WARN AbstractS3ACommitterFactory:90 - Using standard FileOutputCommitter to commit work. This is slow and potentially unsafe.
2022-11-29 03:24:24,233+0000 [task-result-getter-2] WARN TaskSetManager:69 - Lost task 0.0 in stage 304.0 (TID 26881) (10.47.202.71 executor 0): org.apache.spark.SparkException: Task failed while writing rows.
at org.apache.spark.sql.errors.QueryExecutionErrors$.taskFailedWhileWritingRowsError(QueryExecutionErrors.scala:500)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:321)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$16(FileFormatWriter.scala:229)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalStateException: Error closing the output.
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:1000)
at org.apache.spark.sql.catalyst.csv.UnivocityGenerator.close(UnivocityGenerator.scala:110)
at org.apache.spark.sql.execution.datasources.csv.CsvOutputWriter.close(CsvOutputWriter.scala:48)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseCurrentWriter(FileFormatDataWriter.scala:64)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:75)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:105)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:305)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1525)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:311)
... 9 more
Caused by: java.io.IOException: regular upload failed: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at org.apache.hadoop.fs.s3a.S3AUtils.extractException(S3AUtils.java:340)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:563)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:380)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:77)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
at sun.nio.cs.StreamEncoder.implClose(StreamEncoder.java:320)
at sun.nio.cs.StreamEncoder.close(StreamEncoder.java:149)
at java.io.OutputStreamWriter.close(OutputStreamWriter.java:233)
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:996)
... 17 more
Caused by: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at com.amazonaws.services.s3.model.S3DataSource$Utils.cleanupDataSource(S3DataSource.java:48)
at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1862)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1816)
at org.apache.hadoop.fs.s3a.S3AFileSystem.putObjectDirect(S3AFileSystem.java:2432)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.lambda$putObject$6(WriteOperationHelper.java:517)
at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:115)
at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:320)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:412)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:316)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:291)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.retry(WriteOperationHelper.java:168)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.putObject(WriteOperationHelper.java:515)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.lambda$putObject$0(S3ABlockOutputStream.java:548)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
... 3 more
2022-11-29 03:24:25,175+0000 [task-result-getter-0] ERROR TaskSetManager:73 - Task 0 in stage 304.0 failed 4 times; aborting job
2022-11-29 03:24:25,181+0000 [main] ERROR FileFormatWriter:94 - Aborting job 651308ed-f3d7-4e4b-ac87-9781282ec58a.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 304.0 failed 4 times, most recent failure: Lost task 0.3 in stage 304.0 (TID 26884) (10.47.202.71 executor 0): org.apache.spark.SparkException: Task failed while writing rows.
at org.apache.spark.sql.errors.QueryExecutionErrors$.taskFailedWhileWritingRowsError(QueryExecutionErrors.scala:500)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:321)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$16(FileFormatWriter.scala:229)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalStateException: Error closing the output.
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:1000)
at org.apache.spark.sql.catalyst.csv.UnivocityGenerator.close(UnivocityGenerator.scala:110)
at org.apache.spark.sql.execution.datasources.csv.CsvOutputWriter.close(CsvOutputWriter.scala:48)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseCurrentWriter(FileFormatDataWriter.scala:64)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:75)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:105)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:305)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1525)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:311)
... 9 more
Caused by: java.io.IOException: regular upload failed: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at org.apache.hadoop.fs.s3a.S3AUtils.extractException(S3AUtils.java:340)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:563)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:380)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:77)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
at sun.nio.cs.StreamEncoder.implClose(StreamEncoder.java:320)
at sun.nio.cs.StreamEncoder.close(StreamEncoder.java:149)
at java.io.OutputStreamWriter.close(OutputStreamWriter.java:233)
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:996)
... 17 more
Caused by: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at com.amazonaws.services.s3.model.S3DataSource$Utils.cleanupDataSource(S3DataSource.java:48)
at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1862)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1816)
at org.apache.hadoop.fs.s3a.S3AFileSystem.putObjectDirect(S3AFileSystem.java:2432)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.lambda$putObject$6(WriteOperationHelper.java:517)
at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:115)
at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:320)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:412)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:316)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:291)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.retry(WriteOperationHelper.java:168)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.putObject(WriteOperationHelper.java:515)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.lambda$putObject$0(S3ABlockOutputStream.java:548)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
... 3 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2454)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2403)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2402)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2402)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1160)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1160)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1160)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2642)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2584)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2573)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:938)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:218)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:186)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:113)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:111)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:125)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:97)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:93)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:93)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:78)
at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:115)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
at com.capitalone.customer.relationships.batch.impression.utils.ImpressionDataWriter.saveRejectsS3(ImpressionDataWriter.java:136)
at com.capitalone.customer.relationships.batch.impression.utils.ImpressionDataWriter.saveRejects(ImpressionDataWriter.java:118)
at com.capitalone.customer.relationships.batch.impression.spark.AccountEventProcessor.processInserts(AccountEventProcessor.java:114)
at com.capitalone.customer.relationships.batch.impression.spark.ImpressionSparkJob.processData(ImpressionSparkJob.java:221)
at com.capitalone.customer.relationships.batch.impression.spark.ImpressionSparkJob.execute(ImpressionSparkJob.java:106)
at com.capitalone.customer.relationships.batch.Impression.main(Impression.java:18)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:63)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: org.apache.spark.SparkException: Task failed while writing rows.
at org.apache.spark.sql.errors.QueryExecutionErrors$.taskFailedWhileWritingRowsError(QueryExecutionErrors.scala:500)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:321)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$16(FileFormatWriter.scala:229)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalStateException: Error closing the output.
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:1000)
at org.apache.spark.sql.catalyst.csv.UnivocityGenerator.close(UnivocityGenerator.scala:110)
at org.apache.spark.sql.execution.datasources.csv.CsvOutputWriter.close(CsvOutputWriter.scala:48)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseCurrentWriter(FileFormatDataWriter.scala:64)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:75)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:105)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:305)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1525)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:311)
... 9 more
Caused by: java.io.IOException: regular upload failed: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at org.apache.hadoop.fs.s3a.S3AUtils.extractException(S3AUtils.java:340)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:563)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:380)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:77)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
at sun.nio.cs.StreamEncoder.implClose(StreamEncoder.java:320)
at sun.nio.cs.StreamEncoder.close(StreamEncoder.java:149)
at java.io.OutputStreamWriter.close(OutputStreamWriter.java:233)
at com.univocity.parsers.common.AbstractWriter.close(AbstractWriter.java:996)
... 17 more
Caused by: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lcom/amazonaws/thirdparty/apache/logging/Log;)V
at com.amazonaws.services.s3.model.S3DataSource$Utils.cleanupDataSource(S3DataSource.java:48)
at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1862)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1816)
at org.apache.hadoop.fs.s3a.S3AFileSystem.putObjectDirect(S3AFileSystem.java:2432)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.lambda$putObject$6(WriteOperationHelper.java:517)
at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:115)
at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:320)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:412)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:316)
at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:291)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.retry(WriteOperationHelper.java:168)
at org.apache.hadoop.fs.s3a.WriteOperationHelper.putObject(WriteOperationHelper.java:515)
at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.lambda$putObject$0(S3ABlockOutputStream.java:548)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at org.apache.hadoop.thirdparty.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:196)
... 3 more
2022-11-29 03:24:25,747+0000 [main] ERROR ImpressionDataWriter:139 - error:org.apache.spark.SparkException: Job aborted
I get the above error and the CSV file isn't created, although the date folder is created in S3.
I have tried upgrading the AWS bundle to a higher version and I still get the same error.
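A hedged note, not from the original post: this particular NoSuchMethodError almost always means the com.amazonaws classes loaded at runtime come from a different (older or unshaded) copy of the SDK than the one hadoop-aws 3.3.1 was built against — the com.amazonaws.thirdparty.apache.logging.Log parameter in the signature only exists in the shaded aws-java-sdk-bundle. Assuming a Maven build and a standard Spark distribution layout, two quick checks:

# which SDK versions Maven actually resolves
mvn dependency:tree -Dincludes=com.amazonaws,org.apache.hadoop:hadoop-aws

# whether the cluster itself ships a competing copy that wins on the executor classpath
ls $SPARK_HOME/jars | grep -i aws

If the cluster's jars directory contains its own aws-java-sdk-bundle, aligning your pom's version to that one (rather than upgrading past it) is usually the fix.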

How to deploy a web project with external JARs in WildFly 11?

I need to deploy my web project to WildFly 11. The first time, I deployed the project without external libraries and it worked correctly, but after adding new libraries WildFly 11 no longer lets me deploy it. The project runs fine as a Java application in the Eclipse IDE. I generated the WAR with Eclipse's project export; the error that comes out is the following.
21:22:43,137 ERROR [org.jboss.msc.service.fail] (MSC service thread 1-8) MSC000001: Failed to start service jboss.deployment.unit."ApiPolizaApesegTest.war".POST_MODULE: org.jboss.msc.service.StartException in service jboss.deployment.unit."ApiPolizaApesegTest.war".POST_MODULE: WFLYSRV0153: Failed to process phase POST_MODULE of deployment "ApiPolizaApesegTest.war"
at org.jboss.as.server.deployment.DeploymentUnitPhaseService.start(DeploymentUnitPhaseService.java:172)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:2032)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1955)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: WFLYSRV0177: Error getting reflective information for class imp.ws.soatcimtc.rimac.soap.SOAPImpl with ClassLoader ModuleClassLoader for Module "deployment.ApiPolizaApesegTest.war" from Service Module Loader
at org.jboss.as.server.deployment.reflect.DeploymentReflectionIndex.getClassIndex(DeploymentReflectionIndex.java:78)
at org.jboss.as.ee.metadata.MethodAnnotationAggregator.runtimeAnnotationInformation(MethodAnnotationAggregator.java:57)
at org.jboss.as.ee.component.deployers.InterceptorAnnotationProcessor.handleAnnotations(InterceptorAnnotationProcessor.java:106)
at org.jboss.as.ee.component.deployers.InterceptorAnnotationProcessor.processComponentConfig(InterceptorAnnotationProcessor.java:91)
at org.jboss.as.ee.component.deployers.InterceptorAnnotationProcessor.deploy(InterceptorAnnotationProcessor.java:76)
at org.jboss.as.server.deployment.DeploymentUnitPhaseService.start(DeploymentUnitPhaseService.java:165)
... 5 more
Caused by: java.lang.NoClassDefFoundError: com/amazonaws/Request
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Unknown Source)
at java.lang.Class.getDeclaredFields(Unknown Source)
at org.jboss.as.server.deployment.reflect.ClassReflectionIndex.<init>(ClassReflectionIndex.java:72)
at org.jboss.as.server.deployment.reflect.DeploymentReflectionIndex.getClassIndex(DeploymentReflectionIndex.java:70)
... 10 more
Caused by: java.lang.ClassNotFoundException: com.amazonaws.Request from [Module "deployment.ApiPolizaApesegTest.war" from Service Module Loader]
at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:198)
at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:412)
at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:400)
at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:116)
... 15 more
21:22:43,142 ERROR [org.jboss.as.controller.management-operation] (External Management Request Threads -- 6) WFLYCTL0013: Operation ("add") failed - address: ([("deployment" => "ApiPolizaApesegTest.war")]) - failure description: {"WFLYCTL0080: Failed services" => {"jboss.deployment.unit.\"ApiPolizaApesegTest.war\".POST_MODULE" => "WFLYSRV0153: Failed to process phase POST_MODULE of deployment \"ApiPolizaApesegTest.war\"
Caused by: java.lang.RuntimeException: WFLYSRV0177: Error getting reflective information for class imp.ws.soatcimtc.rimac.soap.SOAPImpl with ClassLoader ModuleClassLoader for Module \"deployment.ApiPolizaApesegTest.war\" from Service Module Loader
Caused by: java.lang.NoClassDefFoundError: com/amazonaws/Request
Caused by: java.lang.ClassNotFoundException: com.amazonaws.Request from [Module \"deployment.ApiPolizaApesegTest.war\" from Service Module Loader]"}}
21:22:43,144 ERROR [org.jboss.as.server] (External Management Request Threads -- 6) WFLYSRV0021: Deploy of deployment "ApiPolizaApesegTest.war" was rolled back with the following failure message:
{"WFLYCTL0080: Failed services" => {"jboss.deployment.unit.\"ApiPolizaApesegTest.war\".POST_MODULE" => "WFLYSRV0153: Failed to process phase POST_MODULE of deployment \"ApiPolizaApesegTest.war\"
Caused by: java.lang.RuntimeException: WFLYSRV0177: Error getting reflective information for class imp.ws.soatcimtc.rimac.soap.SOAPImpl with ClassLoader ModuleClassLoader for Module \"deployment.ApiPolizaApesegTest.war\" from Service Module Loader
Caused by: java.lang.NoClassDefFoundError: com/amazonaws/Request
Caused by: java.lang.ClassNotFoundException: com.amazonaws.Request from [Module \"deployment.ApiPolizaApesegTest.war\" from Service Module Loader]"}}
I think WildFly might not be recognizing the external libraries; I added them as individual JARs.
You need to package your "external libraries" into your WAR and redeploy the application using the new WAR.
Also, WildFly 11 is quite old; I would recommend upgrading to WildFly 23.
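A quick way to verify the packaging, assuming a Unix-like shell and that the WAR name matches the log above:

jar tf ApiPolizaApesegTest.war | grep WEB-INF/lib

The AWS jar that provides com.amazonaws.Request should appear in that listing. Note that in Eclipse, JARs added only to the Java build path are not exported automatically; they must either be listed under Project Properties > Deployment Assembly or placed directly in WebContent/WEB-INF/lib to end up inside the WAR.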

Sync Gradle projects with workspace failed: Execution failed for task

I am trying to import a fresh project into my Eclipse Photon workspace with Java 8 and Gradle 4.8.1. I see the following exception. I am not sure why it is trying to run tests while importing the project itself.
org.gradle.tooling.BuildException: Could not run build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-4.8.1-all.zip'.
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:51)
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:29)
...
Caused by: org.gradle.internal.exceptions.LocationAwareException: Execution failed for task ':buildSrc:test'.
at org.gradle.initialization.DefaultExceptionAnalyser.transform(DefaultExceptionAnalyser.java:74)
at org.gradle.initialization.MultipleBuildFailuresExceptionAnalyser.transform(MultipleBuildFailuresExceptionAnalyser.java:47)
...
Caused by: org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':buildSrc:test'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:110)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:77)
...
... 3 more
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file:///C:/sources/elasticsearch-master/buildSrc/build-bootstrap/reports/tests/test/index.html
at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
at org.gradle.api.tasks.testing.Test.executeTests(Test.java:603)
...
... 30 more
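Background, in case it helps: Gradle compiles — and by default tests — the buildSrc project before configuring the main build, which is why an IDE import alone triggers :buildSrc:test. The actual failures are in the HTML report the error points at. As a local workaround sketch, assuming you are able to edit buildSrc/build.gradle and are willing to skip those tests during import:

// buildSrc/build.gradle
test {
    enabled = false   // skip buildSrc's own tests while importing; re-enable before CI
}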
UPDATE
Neither of those worked; I am now using Gradle 4.4.1 and get:
"Loading Gradle project preview failed due to an error in the referenced Gradle build"
org.gradle.tooling.BuildException: Could not fetch model of type 'GradleBuild' using Gradle installation 'C:\gradle'.
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:51)
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:29)
at org.gradle.tooling.internal.consumer.ResultHandlerAdapter.onFailure(ResultHandlerAdapter.java:41)
...
Caused by: org.gradle.internal.exceptions.LocationAwareException: Execution failed for task ':buildSrc:compileGroovy'.
at org.gradle.initialization.DefaultExceptionAnalyser.transform(DefaultExceptionAnalyser.java:74)
at org.gradle.initialization.MultipleBuildFailuresExceptionAnalyser.transform(MultipleBuildFailuresExceptionAnalyser.java:47)
...
Caused by: org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':buildSrc:compileGroovy'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:100)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:70)
...
... 95 more
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: Compilation failed; see the compiler error output for details.
at org.gradle.api.internal.tasks.compile.ApiGroovyCompiler.execute(ApiGroovyCompiler.java:180)
at org.gradle.api.internal.tasks.compile.ApiGroovyCompiler.execute(ApiGroovyCompiler.java:56)
...
... 3 more

Jenkins Build Error : java.lang.NoClassDefFoundError: org/tmatesoft/sqljet/core/table/SqlJetTimeoutBusyHandler

Error Log:
Started by user jenkins
Building in workspace /www/dev/apps/jenkins/workspace/app_11001
FATAL: org/tmatesoft/sqljet/core/table/SqlJetTimeoutBusyHandler
java.lang.NoClassDefFoundError: org/tmatesoft/sqljet/core/table/SqlJetTimeoutBusyHandler
at org.tmatesoft.svn.core.internal.db.SVNSqlJetDb.<clinit>(SVNSqlJetDb.java:49)
at org.tmatesoft.svn.core.internal.wc17.db.SVNWCDb.openDb(SVNWCDb.java:4775)
at org.tmatesoft.svn.core.internal.wc17.db.SVNWCDb.parseDir(SVNWCDb.java:1849)
at org.tmatesoft.svn.core.wc2.SvnOperationFactory.detectWcGeneration(SvnOperationFactory.java:1555)
at org.tmatesoft.svn.core.wc2.SvnOperationFactory.getImplementation(SvnOperationFactory.java:1219)
at org.tmatesoft.svn.core.wc2.SvnOperationFactory.run(SvnOperationFactory.java:1138)
at org.tmatesoft.svn.core.wc2.SvnOperation.run(SvnOperation.java:294)
at org.tmatesoft.svn.core.wc.SVNWCClient.doInfo(SVNWCClient.java:2485)
at hudson.scm.subversion.UpdateUpdater$TaskImpl.parseSvnInfo(UpdateUpdater.java:126)
at hudson.scm.subversion.UpdateUpdater$TaskImpl.getSvnCommandToUse(UpdateUpdater.java:88)
at hudson.scm.subversion.UpdateUpdater$TaskImpl.perform(UpdateUpdater.java:131)
at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:162)
at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:1001)
at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:977)
at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:953)
at hudson.FilePath.act(FilePath.java:1018)
at hudson.FilePath.act(FilePath.java:996)
at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:902)
at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:838)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1278)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:604)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
at hudson.model.Run.execute(Run.java:1720)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Caused by: java.lang.ClassNotFoundException: org.tmatesoft.sqljet.core.table.SqlJetTimeoutBusyHandler
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 28 more
Finished: FAILURE
I've also added the sqljet.jar file to my /lib folder, but that hasn't helped. When I click Build, after a short while I get the same error and the build fails.
I am using Apache Tomcat 8 for Jenkins.
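Two observations, offered as assumptions rather than a confirmed diagnosis: the NoClassDefFoundError is raised inside the Jenkins Subversion plugin, which ships its own SVNKit/SqlJet jars and loads them through the plugin's classloader, so jars dropped into Tomcat's or the webapp's /lib are loaded elsewhere and typically won't be seen. Updating or reinstalling the Subversion plugin via Manage Jenkins > Manage Plugins is the usual remedy. To at least confirm the jar you added really contains the missing class (Unix-like shell assumed):

unzip -l sqljet.jar | grep SqlJetTimeoutBusyHandler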

Jenkins sonar check failure: Caused by: java.lang.ClassNotFoundException: org.sonar.api.checks.NoSonarFilter

When I run the Jenkins build job, it throws the following error and the job fails.
Has anyone seen this error?
Jenkins version: 1.642.1
Sonar version: 5.4 (Java plugin 3.5)
Here is the error info:
[ERROR] Failed to execute goal org.sonarsource.scanner.maven:sonar-maven-plugin:3.0.1:sonar (default-cli) on project ods-replicator: Unable to register extension org.sonar.plugins.java.JavaSquidSensor: Lorg/sonar/api/checks/NoSonarFilter;: org.sonar.api.checks.NoSonarFilter -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.sonarsource.scanner.maven:sonar-maven-plugin:3.0.1:sonar (default-cli) on project ods-replicator: Unable to register extension org.sonar.plugins.java.JavaSquidSensor
......
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.MojoExecutionException: Unable to register extension org.sonar.plugins.java.JavaSquidSensor
at org.sonarsource.scanner.maven.bootstrap.ExceptionHandling.handle(ExceptionHandling.java:36)
at org.sonarsource.scanner.maven.bootstrap.RunnerBootstrapper.execute(RunnerBootstrapper.java:81)
at org.sonarsource.scanner.maven.SonarQubeMojo.execute(SonarQubeMojo.java:112)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 31 more
Caused by: java.lang.IllegalStateException: Unable to register extension org.sonar.plugins.java.JavaSquidSensor
at org.sonar.core.platform.ComponentContainer.addExtension(ComponentContainer.java:241)
...
... 34 more
Caused by: java.lang.NoClassDefFoundError: Lorg/sonar/api/checks/NoSonarFilter;
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
at java.lang.Class.getDeclaredFields(Class.java:1916)
at org.picocontainer.injectors.AdaptingInjection$2.run(AdaptingInjection.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at org.picocontainer.injectors.AdaptingInjection.injectionFieldAnnotated(AdaptingInjection.java:209)
at org.picocontainer.injectors.AdaptingInjection.fieldAnnotatedInjectionAdapter(AdaptingInjection.java:188)
at org.picocontainer.injectors.AdaptingInjection.createComponentAdapter(AdaptingInjection.java:57)
at org.picocontainer.behaviors.AbstractBehaviorFactory.createComponentAdapter(AbstractBehaviorFactory.java:44)
at org.picocontainer.behaviors.OptInCaching.createComponentAdapter(OptInCaching.java:45)
at org.picocontainer.DefaultPicoContainer.addComponent(DefaultPicoContainer.java:536)
at org.picocontainer.DefaultPicoContainer.access$300(DefaultPicoContainer.java:84)
at org.picocontainer.DefaultPicoContainer$AsPropertiesPicoContainer.addComponent(DefaultPicoContainer.java:1149)
at org.sonar.core.platform.ComponentContainer.addExtension(ComponentContainer.java:239)
... 63 more
Caused by: java.lang.ClassNotFoundException: org.sonar.api.checks.NoSonarFilter
at org.sonar.classloader.ParentFirstStrategy.loadClass(ParentFirstStrategy.java:39)
at org.sonar.classloader.ClassRealm.loadClass(ClassRealm.java:87)
at org.sonar.classloader.ClassRealm.loadClass(ClassRealm.java:76)
... 77 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
The listed compatibility for version 3.5 of the Java plugin ends at SonarQube 5.1. There have been API changes in the 5.x series, so this can probably be solved by upgrading the Java plugin.
