I want to try out Java in AWS, specifically Lambda.
I don't know much Java, so I'm hoping to learn it in this endeavor. Please be gentle with a newbie.
I'm trying to get a hello world working so I can start iterating, but I don't know the syntax.
My java program:
$ cat helloWorld.java
class helloWorld
{
public static void main(String args[])
{
System.out.println("Hello, World");
}
}
I zip it up with: zip java helloWorld.java
The handler I call in Lambda is helloWorld::main
The result I currently get when I test it is
START RequestId: 32ef4680-b741-402d-9af7-7c0b0c9e5f1f Version: $LATEST
Class not found: helloWorld: java.lang.ClassNotFoundException
java.lang.ClassNotFoundException: helloWorld
Lambda will not compile and run your Java code if you only upload the source file. For a compiled language you have to upload a complete deployment package. For an interpreted language like Python or Node.js you only need to upload the code, and you can also edit it in the console editor; that is not the case for a compiled language, where you must upload the deployment package.
AWS Lambda Deployment Package in Java
Your deployment package can be a .zip file or a standalone jar; it is
your choice. You can use any build and packaging tool you are familiar
with to create a deployment package.
We provide examples of using Maven to create standalone jars and using Gradle to create a .zip file. For more information, see the following topics:
Topics
Creating a .jar Deployment Package Using Maven without any IDE (Java)
Creating a .jar Deployment Package Using Maven and Eclipse IDE (Java)
Creating a ZIP Deployment Package for a Java Function
Authoring Lambda Functions Using Eclipse IDE and AWS SDK Plugin (Java)
You can read more details here
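For reference, here is a minimal sketch of what a Java Lambda handler can look like once it is built into a deployment package. This assumes the aws-lambda-java-core library is on your build path; the package, class and method names are just examples, and with them the handler setting would be example.Hello::handleRequest instead of helloWorld::main.

package example;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Example handler class; Lambda calls handleRequest with the event payload.
public class Hello implements RequestHandler<String, String> {
    @Override
    public String handleRequest(String input, Context context) {
        return "Hello, World";
    }
}

The key point is that Lambda needs compiled .class files (normally packaged into a jar by Maven or Gradle, along with any dependencies), not the .java source; zipping only the source file is why the ClassNotFoundException appears.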
Related
I have several Spark big data applications written in Scala. Each of these applications also has a version written in R.
I also have a web server application written in Java. It is provided as an API for the web GUI. The purpose is to let the GUI execute these applications and choose the version: R or Spark. I managed to call the R code from the Java API and get the result as JSON. But now it seems to be quite complicated to execute the Spark programs.
Until now, I was able to merge one of the Scala .jar files with the Java API using Maven. I do this by placing my Spark program as a local repository in pom.xml so that the Scala code is included in the final .jar package. I also listed the Scala and Breeze libraries as dependencies in the pom.xml. When I try to send a request with the API, it of course throws an error saying java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$. At this point I realized it was because I hadn't listed the Spark library in the Maven dependencies, but then I think I've been doing it wrong, since Spark applications are generally run by executing the spark-submit command in a terminal.
So now what I'm thinking is putting the Java API .jar and the Scala .jar in one folder, and then executing spark-submit from inside the Java API .jar, targeting the Scala .jar. Is this even correct? And how do I execute spark-submit from Java code? Does it have to be via Runtime.exec(), as mentioned here?
SparkLauncher can be used to submit the Spark code (written in Scala, with the precompiled jar scala.jar placed at a certain location) from the Java API code.
The Spark documentation recommends using SparkLauncher as shown below to submit a Spark job programmatically from inside Java applications. Add the code below to your Java API code.
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;
public class MyLauncher {
public static void main(String[] args) throws Exception {
SparkAppHandle handle = new SparkLauncher()
.setAppResource("/my/scala.jar")
.setMainClass("my.spark.app.Main")
.setMaster("local")
.setConf(SparkLauncher.DRIVER_MEMORY, "2g")
.startApplication();
// Use handle API to monitor / control application.
}
}
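If your Java API needs to wait for the job to finish or report its progress, the SparkAppHandle returned by startApplication() can be polled or given a listener. A minimal sketch, reusing the same placeholder jar path and main class as above:

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MyBlockingLauncher {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("/my/scala.jar")      // placeholder path, as above
            .setMainClass("my.spark.app.Main")    // placeholder main class, as above
            .setMaster("local")
            .startApplication();

        // Poll until the application reaches a final state (FINISHED, FAILED or KILLED).
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Spark application ended in state: " + handle.getState());
    }
}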
I have some code in an Android project that parses HTML using Jsoup. It doesn't use anything Android specific, they're just static methods that take an InputStream, and return my model classes. The app uses Gradle to build itself in Android Studio.
Is there any way I can create a standard Java main method to do something like load HTML from a local file, run it through my parser, and output a JSON file (using Gson on my model class)? I'm thinking maybe I can add a new sourceSet to Gradle like a jvmCompatible set of classes? I would greatly prefer not to copy my code to a separate project.
EDIT:
I guess I didn't make this clear, but I would like to be able to run this locally on my dev machine from the command line, rather than on an Android device or emulator.
You don't necessarily need to do anything in the build file to set this up; the build file generates Java .class files, and you can feed them to Java directly from the command line. You can add a main method to any class:
package com.example.foo;
class MyClass {
...
public static void main(String [] args) {
...
}
}
The main method will be happily ignored until you invoke it via the Java command line. You can do this by setting your classpath to the intermediate build directory and telling the Java command line which class to start:
java -classpath app/build/intermediates/classes/debug/ com.example.foo.MyClass
where you pass in the path to the build/intermediates/classes/debug directory in your app module's build output, and the fully-qualified name of the class.
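As a concrete illustration of what such a main method could look like for your use case, here is a rough sketch; MyParser and MyModel are placeholders for your existing Jsoup-based parser and model class, since I don't know their real names:

package com.example.foo;

import com.google.gson.Gson;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.InputStream;
import java.io.Writer;

public class ParserCli {
    public static void main(String[] args) throws Exception {
        // args[0] = input HTML file, args[1] = output JSON file
        try (InputStream in = new FileInputStream(args[0]);
             Writer out = new FileWriter(args[1])) {
            MyModel model = MyParser.parse(in);   // placeholder for your parser entry point
            new Gson().toJson(model, out);        // serialize the model with Gson
        }
    }
}

You would run it with the same java -classpath invocation, adding the Gson and Jsoup jars to the classpath.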
Note that if you're running a release build that uses ProGuard, this main method could get stripped out if it's not otherwise referenced in the code.
Make sure you don't access any Android classes or you'll get a runtime error.
As an aside, you might find it worthwhile to separate out your Java-only code into a Java-only module in the build. Among other things, it would let you use JUnit to write nice test cases for the classes within; if you're asking this question because you want to do some testing of your parser, you might find it convenient to do so within the auspices of a unit test.
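For instance, if the parser lived in a Java-only module, a unit test could be as small as this sketch (JUnit 4 assumed; MyParser and MyModel are again placeholder names):

import static org.junit.Assert.assertEquals;

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import org.junit.Test;

public class MyParserTest {
    @Test
    public void parsesTitleFromSimpleDocument() throws Exception {
        InputStream html = new ByteArrayInputStream(
                "<html><head><title>hello</title></head></html>".getBytes("UTF-8"));
        MyModel model = MyParser.parse(html);
        assertEquals("hello", model.getTitle());  // getTitle() is a placeholder accessor
    }
}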
This is my first time with Java and Eclipse. I started a brand new Java project and I want to import/add NanoHTTPD into it. How do you do this?
This is NanoHTTPD's site: http://nanohttpd.com
Thanks!
Edit
Lesson learned: be specific or you get backlash for asking. I edited the question and here's some background and the problem I'm running into.
I'm developing a Node.js backend that needs to query a Java project I was given. Pipes are a no-go because the services will run on different machines. Tomcat seems like overkill, so I decided to use NanoHTTPD to develop the web service. I come from Ruby and Node.js, so compilation and Eclipse are very new to me. First off, I have no JAR file, just TAR and ZIP files, and from what I read they are fundamentally different. However, I tried importing the TAR and ZIP files as recommended, but the structure I get in Eclipse does not seem right compared to the JRE System Library or others I've seen. Notwithstanding, I went ahead and tried to import the package from my Main.java file:
package fi.iki.elonen;
public class Main {
public static void main (String[] args)
{
System.out.println("Main");
}
}
When I try to run it I get the following error:
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
at Main.main(Main.java:4)
I found a great article from IBM "Managing the Java classpath (UNIX and Mac OS X)" where it mentions that an IDE such as Eclipse can alleviate the pain of dealing with the source path, classpath and compilation. Unfortunately, I'm afraid this is where I might be getting stuck.
I tried uploading images of what I have but apparently I'm not popular enough yet to do it.
Could someone help me figure out how to not only import libraries but also use them in projects? Even just a URL to a clear Linux/Mac OS X post that explains importing with multiple packages would be great.
NanoHTTPD is designed to be very lightweight.
I just cut and pasted the NanoHTTPD class from the source on GitHub - it's all in there - and pasted it as a class into my own project.
Then I created a subclass of NanoHTTPD, overrode the serve method to send my own stuff, and the job was done.
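For reference, a subclass along these lines is roughly what that looks like (this assumes the NanoHTTPD 2.x API, where serve takes an IHTTPSession; the port and response are just examples):

import java.io.IOException;
import fi.iki.elonen.NanoHTTPD;

public class HelloServer extends NanoHTTPD {
    public HelloServer() throws IOException {
        super(8080);                                  // listen on port 8080
        start(NanoHTTPD.SOCKET_READ_TIMEOUT, false);  // daemon=false keeps the JVM alive
    }

    @Override
    public Response serve(IHTTPSession session) {
        return newFixedLengthResponse("<html><body>Hello from NanoHTTPD</body></html>");
    }

    public static void main(String[] args) throws IOException {
        new HelloServer();
    }
}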
Download the jar, drag it into the project, and right-click it to add it to the build path.
When I attempt to deploy a Scala class to GAE I receive this error in the logs:
Uncaught exception from servlet
java.lang.NoClassDefFoundError: scala/Predef$
at com.myapp.controller.FirstTest.getString(FirstTest.scala:7)
at com.gogofindit.myapp.RedirectController.displaySearch(RedirectController.java:20)
The app works locally; it's just when I deploy to App Engine that I receive the error.
According to this doc, Scala is supported: https://developers.google.com/appengine/docs/
Here is the Scala class :
class FirstTest {
def getString = {
println("In scala")
"search"
}
}
Do I need to update a config file within the project so that the Scala classes are compiled?
Scala is supported without any additional effort.
But your Scala classes need to be added to the compile process, either in pom.xml or in the project config (by adding the Scala nature in Eclipse, for instance).
Check whether you can find the compiled classes in bin/ or target/classes, next to the Java classes.
I have a Scala project and I would like to export it as a jar.
1. At first I tried creating a Java class for the project as an entry point:
public class JMain {
    public static void main(String[] args) {
        System.out.println("Java main calling Scala main");
        SMain.main(new String[] {""}); //SMain.main is the actual *main*
    }
}
and this worked fine and dandy when launched from Eclipse, but when I export it as a jar it gives me 18 exceptions or so. I do not know how to replicate the "environment" in which Eclipse manages to launch this, and I'm pretty sure it relies on the fact that Scala is already on my system - I need a self-contained jar with everything packed in there.
2. My second try consisted of trying what lach suggested in How to deploy a Scala project from Eclipse?
namely:
import java.util.ArrayList;
import java.util.List;

public class JMain {
    public static void main(String[] args) {
        System.out.println("Java Main");
        List<String> argList = new ArrayList<String>();
        argList.add("fully.qualified.ClassName"); //???
        for (String s : args) argList.add(s);
        scala.tools.nsc.MainGenericRunner.main(argList.toArray(new String[0]));
    }
}
This time it won't even run from Eclipse, although it gives only 6 or so exceptions, starting with the famous NoClassDefFoundError. I have a feeling I'm not getting fully.qualified.ClassName right.
3. If the main Scala class is called "Dis.scala" and is located in package "pack", shouldn't this fully.qualified.ClassName be "pack.Dis"?
I'm using JRE 1.6 and Scala 2.9.2.
EDIT: I have included all external imported jars, even scala-library.jar - everything is packed nicely in the jar.
P.S. I am not familiar with Ant, Maven or sbt. I just want my Scala project jarred - if possible without getting into hairy things.
Here is what worked for me:
1. Create a Scala project
2. Create a wrapper Java project
3. Add the scala-library.jar to your Java project's build path.
So you only need the 3rd step in addition, since the rest looks similar to what you did. Then you can happily use: java -jar file.jar
EDIT:
How to create a JAR file which contains Scala code and can be consumed by another Java project, using the Scala IDE for Eclipse.
Create a new Scala Project and define an object with a main method as entry point.
Now create a new Java project and add your Scala project to the new one's build path. Additionally, add the scala-library.jar to the Java project.
Now create a wrapper class in the Java project which calls your entry point object from the Scala project (sketched below). Run the wrapper class to create an Eclipse run configuration and test whether you can call the Scala project.
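A minimal sketch of such a wrapper, assuming the Scala project defines an entry point object pack.Dis with a main method (names taken from the question, used here only as placeholders). A Scala object's main is exposed to Java as a static method, so it can be called directly:

public class Wrapper {
    public static void main(String[] args) {
        // pack.Dis is the Scala object with the main method; scalac generates a
        // static forwarder, so Java can call it like any other static method.
        pack.Dis.main(args);
    }
}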
Now use the Export -> Java -> Runnable JAR file wizard on the wrapper project. The Eclipse run configuration will be used as the entry point into the JAR. Depending on your needs you may want to:
extract required libraries into generated JAR
or
Package required libraries into generated JAR
Finally you get a completely packaged JAR which you can use like this:
java -jar wrapped.jar
For me, it was relatively straightforward.
Develop and test the project using the Scala IDE (or Eclipse for Java).
Once ready, generate the jar for the project using File -> Export.
For submitting to Spark (I was writing something for Spark), I just had to use the --class option to specify the main class for the jar, as in the example below.
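For example (the main class and jar names here are just placeholders):
spark-submit --class pack.Dis --master local[*] myproject.jar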
Hope this helps.