I am getting the error below when using AWS Lambda with Java.
I have created a Dukascopy-SDK-backed Java service and deployed it to AWS Lambda, but it fails with:
2020-03-17 03:33:49.421 INFO FilePathManager - WL info is unavailable. Use default App folder for Platform data: /home/sbx_user1051/JForex
java.lang.RuntimeException: java.io.IOException: Unable to create file [/temp/.cache/version.txt.lck], No such file or directory
at com.dukascopy.charts.data.datacache.LocalCacheManager.<init>(LocalCacheManager.java:141)
Please help me understand the issue.
There is no such folder as /temp. On Lambda the only writable location is /tmp, so the cache path must point there, not at /temp.
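If the cache location is set from your own code, you can point it at /tmp before the SDK initializes. A minimal sketch, assuming the JForex IClient API exposes setCacheDirectory (check your SDK version; the method name is an assumption here):

import java.io.File;

import com.dukascopy.api.system.ClientFactory;
import com.dukascopy.api.system.IClient;

public class LambdaCacheSetup {
    public static void main(String[] args) throws Exception {
        // /tmp is the only writable directory in the Lambda filesystem
        File cacheDir = new File("/tmp/.cache");
        cacheDir.mkdirs();

        IClient client = ClientFactory.getDefaultInstance();
        // Assumption: setCacheDirectory is available in your JForex SDK version
        client.setCacheDirectory(cacheDir);
    }
}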
I finally reached the point where my Elastic Beanstalk instance/environment launches (Java Corretto 11 platform). Now it fails to start the provided .jar file.
In the eb-engine.log file, I cannot find any error beyond this:
2021/05/27 11:36:25.889735 [INFO] Executing instruction: StageJavaApplication
2021/05/27 11:36:25.889871 [ERROR] An error occurred during execution of command [app-deploy] - [StageJavaApplication]. Stop running the command. Error: staging java app failed due to invalid zip file
The jar file is a Spring Boot application built with mvn -B package.
Locally the whole thing starts, but then crashes because of missing environment variables (expected behaviour).
But it seems AWS is not even starting the application.
Any suggestions on this?
Spring Boot apps run nicely on Elastic Beanstalk. However, you do need to set some variables. For example, have you set the SERVER_PORT environment variable to 5000?
And as you stated, to use an AWS service client successfully, you can set environment variables for your credentials. Here is an end-to-end walkthrough that shows how to run a Spring Boot app that invokes several AWS services on Elastic Beanstalk:
Creating your first AWS Java web application
PS - your log file mentions a ZIP file. Be sure to create the JAR properly as discussed in the above example.
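On why port 5000 matters: Elastic Beanstalk's Java platform puts an nginx proxy in front of the app, and by default that proxy forwards to port 5000, so the app has to listen there. A minimal sketch of setting the port in code (setting the SERVER_PORT environment variable in the EB console achieves the same thing):

import java.util.Collections;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        // Elastic Beanstalk's nginx proxy forwards traffic to port 5000 by default
        app.setDefaultProperties(Collections.singletonMap("server.port", "5000"));
        app.run(args);
    }
}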
Just in case someone arrives here looking for an answer to this error:
Error: staging java app failed due to invalid zip file
I was renaming my service jar in Gradle, using:
tasks.withType<org.springframework.boot.gradle.tasks.bundling.BootJar> {
    // rename the artifact to a fixed name
    archiveFileName.set("service.jar")
    // prepends a launch script, producing a "fully executable" jar
    launchScript()
}
And Elastic Beanstalk was not happy about the renaming.
When I let it keep the default name, there were no zip issues and everything worked like a charm.
I receive this error message in Jenkins when trying to deploy my application to Kubernetes. Is there something I am missing?
istio-gateway.yml ERROR: ERROR: java.io.IOException: ERROR: YAML file
istio-gateway.yml is invalid, please check it. Details:
java.io.IOException: Unknown apiVersionKind:
networking.istio.io/v1alpha3/Gateway known kinds are...
I am using Spring Boot with Java.
You are trying to create an Istio Gateway with that YAML. The "Unknown apiVersionKind" error means the cluster does not recognize networking.istio.io/v1alpha3/Gateway, which usually means Istio is not installed in your Kubernetes cluster. If it's not installed, you need to install it; the Gateway kind becomes available as part of the Istio installation.
While using the AWS Toolkit for Eclipse, an error is encountered when attempting to upload a Lambda function: JAR creation failed. It is a simple "Hello World" application, created to verify that the environment works; it follows the AWS tutorial: https://docs.amazonaws.cn/toolkit-for-eclipse/v1/user-guide/lambda-tutorial.html
"Maven Build..." creates the JAR without any error, following the steps in https://docs.amazonaws.cn/en_us/lambda/latest/dg/java-create-jar-pkg-maven-and-eclipse.html.
Eclipse is being run on a Mac and the JRE is Amazon Corretto 11 (11.0.4).
Any suggestions?
Here are the details of the errors:
Failed to upload project to Lambda
com.amazonaws.eclipse.core.exceptions.AwsActionException
at com.amazonaws.eclipse.lambda.upload.wizard.UploadFunctionWizard.doFinish(UploadFunctionWizard.java:115)
at com.amazonaws.eclipse.core.plugin.AbstractAwsJobWizard$1.run(AbstractAwsJobWizard.java:35)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: java.lang.NullPointerException
at com.amazonaws.eclipse.lambda.upload.wizard.util.UploadFunctionUtil.performFunctionUpload(UploadFunctionUtil.java:82)
at com.amazonaws.eclipse.lambda.upload.wizard.UploadFunctionWizard.doFinish(UploadFunctionWizard.java:111)
Unable to export project [testproject] to jar file
java.lang.reflect.InvocationTargetException: JAR creation failed. See details for additional information.
at org.eclipse.jdt.internal.ui.jarpackager.JarFileExportOperation.singleRun(JarFileExportOperation.java:1006)
at org.eclipse.jdt.internal.ui.jarpackager.JarFileExportOperation.execute(JarFileExportOperation.java:996)
at org.eclipse.ui.actions.WorkspaceModifyOperation.lambda$0(WorkspaceModifyOperation.java:110)
at org.eclipse.core.internal.resources.Workspace.run(Workspace.java:2295)
at org.eclipse.core.internal.resources.Workspace.run(Workspace.java:2322)
at org.eclipse.ui.actions.WorkspaceModifyOperation.run(WorkspaceModifyOperation.java:131)
at com.amazonaws.eclipse.lambda.upload.wizard.util.FunctionJarExportHelper.exportProjectToJarFile(FunctionJarExportHelper.java:78)
at com.amazonaws.eclipse.lambda.upload.wizard.util.UploadFunctionUtil.performFunctionUpload(UploadFunctionUtil.java:68)
at com.amazonaws.eclipse.lambda.upload.wizard.UploadFunctionWizard.doFinish(UploadFunctionWizard.java:111)
at com.amazonaws.eclipse.core.plugin.AbstractAwsJobWizard$1.run(AbstractAwsJobWizard.java:35)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
I am using the JBoss OpenShift tooling to deploy my Java web application on OpenShift 3 and cannot find a solution. I am using Eclipse Oxygen.
Error while publishing server webapp (Service) at OpenShift 3 (console.starter-ca-central-1.openshift.com). Could not sync all pods to folder C:\Users\user\eclipse-workspace\.metadata\.plugins\org.jboss.ide.eclipse.as.core\webapp_(Service)_at_OpenShift_3_(console.starter-ca-central-1.openshift.com)\deploy
Syncing com.openshift.restclient.capability.resources.IRSyncable$PodPeer#5a251b05 to com.openshift.restclient.capability.resources.IRSyncable$LocalPeer#3f786c11 failed: WARNING: rsync command not found in path. Download cwRsync for Windows and add it to your PATH.
ERROR: Error extracting file "./.git/objects/pack/pack-349c37438963079e19219bb573c32ec43515bc59.idx": open C:\Users\user\eclipse-workspace\.metadata\.plugins\org.jboss.ide.eclipse.as.core\webapp_(Service)_at_OpenShift_3_(console.starter-ca-central-1.openshift.com)\deploy\.git\objects\pack\pack-349c37438963079e19219bb573c32ec43515bc59.idx: Access is denied.
ERROR: Error extracting tar stream
Ignoring the following flags because they only apply to rsync: --exclude, --no-perms
error: error extracting tar at destination directory: open C:\Users\user\eclipse-workspace\.metadata\.plugins\org.jboss.ide.eclipse.as.core\webapp_(Service)_at_OpenShift_3_(console.starter-ca-central-1.openshift.com)\deploy\.git\objects\pack\pack-349c37438963079e19219bb573c32ec43515bc59.idx: Access is denied.
I am struggling with Azure wasb on Spark.
I am loading a .json.gz file from disk into HDFS. I have used the following code extensively on other systems.
val file_a_raw = sqlContext.read.json("/home/users/repo_test/file_a.json.gz")
However, on Azure, this returns:
java.io.FileNotFoundException: Filewasb://server-2017-03-07t08-13-41-314z#server.blob.core.windows.net/home/users/repo_test/file_a.json.gz does not exist.
I have checked this location and the file is there and correct.
I think there should be a : between .net and the file path, but I get a Java error when I try to add one manually.
java.lang.IllegalArgumentException: java.net.URISyntaxException: Expected scheme name at index 0:
I've also tried:
Filewasb:///home/users/repo_test/file_a.json.gz
But that returns:
java.io.IOException: No FileSystem for scheme: Filewasb
This code works fine on non-Azure Spark.
For Azure, you'll need to configure Spark with the proper credentials. Databricks has documentation on this: https://docs.databricks.com/user-guide/faq/azure-blob-storage.html
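A minimal sketch of what that configuration looks like, using Spark's Java API and hypothetical account/container names (youraccount, yourcontainer). The Hadoop configuration key fs.azure.account.key.<account>.blob.core.windows.net carries the storage account key, and the path must be a full wasb:// URI rather than a local filesystem path:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class WasbReadExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("wasb-read").getOrCreate();

        // Hypothetical storage account; the key authorizes access to the blob container
        spark.sparkContext().hadoopConfiguration().set(
                "fs.azure.account.key.youraccount.blob.core.windows.net",
                "<your-storage-account-key>");

        // wasb URIs take the form wasb://<container>@<account>.blob.core.windows.net/<path>
        Dataset<Row> df = spark.read().json(
                "wasb://yourcontainer@youraccount.blob.core.windows.net/home/users/repo_test/file_a.json.gz");
        df.show();
    }
}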