Calling a Local/Server Node.js Application from a Spring Boot Java Web API

We have installed a Node.js utility on a local machine/server using
npm install -g openapi2Arbitaryservice
I can call this utility from the command line and it works fine. It reads a JSON file and creates a service API:
openapi2Arbitaryservice generateApi myProxySwagSep6 -s ./swagger-petstore.json -d ./aug-2021
I want to trigger this when an API request with a JSON body arrives at the Tomcat server. I have a Spring Boot application using Java 8.
I was trying to use Java's exec method to call this locally installed Node.js application when an API request reaches the server and then do the processing:
String[] command = { "/bin/bash", "-c", "openapi2ArbitaryService generateApi myProxySwagSep6 -s ./swagger-petstore.json -d ./aug-2021" };
Process p = Runtime.getRuntime().exec(command);
But I get the error "Exit Value 127". It seems it is not able to recognise openapi2ArbitaryService.
Is this a valid use case for Java, calling a locally/server-installed application on user API requests? Or should this be decoupled using a queue and handled separately, e.g. as a Docker task?
Would Go or Python be more suitable for such use cases? Please share some thoughts.
Thanks in advance!
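For reference: exit value 127 from bash means "command not found", which usually indicates that the non-interactive shell started by Runtime.exec does not have the npm global bin directory on its PATH. Below is a minimal sketch of invoking the utility by absolute path with ProcessBuilder; the /usr/local/bin location and the class/method names are assumptions for illustration (check the real path with which openapi2ArbitaryService).

import java.io.IOException;

public class OpenApiGenerator {

    // Assumption: the utility was installed to /usr/local/bin by npm install -g;
    // verify with `which openapi2ArbitaryService`. Runtime.exec/ProcessBuilder do not
    // load your interactive shell profile, so the npm global bin directory may be
    // missing from PATH, which is the usual cause of exit value 127.
    private static final String CLI = "/usr/local/bin/openapi2ArbitaryService";

    public static int generate(String proxyName, String swaggerFile, String outputDir)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                CLI, "generateApi", proxyName,
                "-s", swaggerFile,
                "-d", outputDir);
        pb.inheritIO();               // forward the tool's stdout/stderr to the server log
        Process process = pb.start();
        return process.waitFor();     // exit value of the CLI; 0 means success
    }

    public static void main(String[] args) throws Exception {
        int exit = generate("myProxySwagSep6", "./swagger-petstore.json", "./aug-2021");
        System.out.println("Exit value: " + exit);
    }
}

The exec approach itself is a valid use case; the main design question is whether to run it inline on the request thread or to decouple it. Long-running generation on the request thread ties up Tomcat workers, so a queue, an async executor, or a separate containerized task is worth considering under load.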

Related

How can I export traces generated by the OpenTelemetry Java agent to Google Cloud Trace?

I've got a Spring Boot application that I'd like to automatically generate traces for using the OpenTelemetry Java agent, and subsequently upload those traces to Google Cloud Trace.
I've added the following code to the entry point of my application for sending traces:
import com.google.cloud.opentelemetry.trace.TraceExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.SimpleSpanProcessor;

OpenTelemetrySdk.builder()
    .setTracerProvider(
        SdkTracerProvider.builder()
            .addSpanProcessor(
                SimpleSpanProcessor.create(TraceExporter.createWithDefaultConfiguration()))
            .build())
    .buildAndRegisterGlobal();
...and I'm running my application with the following JVM arguments:
-javaagent:path/to/opentelemetry-javaagent-all.jar \
-jar myapp.jar
...but I don't know how to connect the two.
Is there some agent configuration I can apply? Something like:
-Dotel.traces.exporter=google_cloud_trace
I ended up resolving this as follows:
1. Clone the GoogleCloudPlatform/opentelemetry-operations-java repo:
git clone git@github.com:GoogleCloudPlatform/opentelemetry-operations-java.git
2. Build the exporter-auto project:
./gradlew clean :exporter-auto:shadowJar
3. Copy the jar produced in exporter-auto/build/libs to my target project.
4. Run the application with the following arguments:
-javaagent:path/to/opentelemetry-javaagent-all.jar
-Dotel.javaagent.experimental.extensions=[artifact-from-step-3].jar
-Dotel.traces.exporter=google_cloud_trace
-Dotel.metrics.exporter=none
-jar myapp.jar
Note: This setup does not require any explicit code changes in the target code base.

Run Java Jar within Python Google Cloud Function

I'm trying to port over some old Ruby code I used to run on Heroku to a Python-based Google Cloud Function.
This code runs Apple's Reporter tool which is "a command-line tool that you can use to download your Sales and Trends reports and Payments and Financial Reports". Docs can be found here.
The Ruby code worked well for years until yesterday, running on Heroku with a Ruby + Java buildpack. A small snippet of this, where options are the args received:
options = [
  vendor_id,
  file_type,
  sub_file_type,
  'Daily',
  trimmed_date,
  version
]

Dir.chdir("#{Rails.root}/tmp/") do
  stdout, stderr, status = Open3.capture3("java -jar #{Rails.root}/public/jars/Reporter.jar p=Reporter.properties m=Robot.XML Sales.getReport #{options.join(', ')}")
  return {:status => status, :error => stderr.to_s, :stdout => stdout.to_s }
end
The error I'm seeing on Heroku after no code or stack updates is Network is available but cannot connect to application. Check your proxy and firewall settings and try again.
Most of our other similar processes have been moved to Google Cloud Functions, so after getting nowhere with the above error I thought I'd move this also.
So a similar snippet this time in Python:
from subprocess import Popen, PIPE

def execute_reporter_jar(vendor_id, trimmed_date, file_type, api_version):
    process = Popen(["java -jar Reporter.jar p=Reporter.properties Sales.getReportVersion Sales, Detailed"], stdin=PIPE, stdout=PIPE, shell=True)
    out, err = process.communicate()
    print("returncode = %s", process.returncode)
    print("stdout = %s", out)
    print("stderr = %s", err)
This works well locally, but when I deploy to Google Cloud it seemingly runs successfully in a few ms; however, nothing actually happens, and when I dig deeper it seems the subprocess is returning a 127 (command not found) error. So it seems the Cloud Function can't access Java.
After a good 24hrs, I've hit a wall with this. Can anyone help? I have zero Java knowledge, and I know Cloud Functions have a Java runtime, but I would prefer to stick with Python.
The ultimate aim is for Apple's reporter to run and save the requested file to Google Cloud Storage.
Thanks in advance!
The execution environment for Cloud Functions with the Python runtime (both 3.7 and 3.8) is currently based on Ubuntu 18.04.
The runtime only includes a limited set of system packages, so running a subprocess that depends on anything outside that set is usually not recommended.
If it's paramount for you to stick with Python, you could try to deploy your function using the Buildpack CLI and extend the builder image to install Java on the Python runtime. Alternatively, if your application can be dockerized, consider building an image yourself with Java included and deploying your application on Cloud Run.

heroku nodejs app - events.js:167 Error Unhandled 'error' event : spawn java ENOENT

I have a Node.js app on Heroku using the pdfMerge.js library.
Following the documentation, I'm using the stream event mechanism as a callback to identify the end of the process,
but then an exception is thrown:
events.js:167 Error: spawn java ENOENT.
I'm almost sure it's happening because I'm missing the required Java installation, as described here:
pdfmerger combines multiple PDF-files into a single PDF-file. It is a node module that utilizes the Apache PDFBox Library, which the required functionality are distributed along with this module. The only requirement for this module to run, is having Java 6 or higher in the path.
I'm not familiar enough with Heroku's installation/configuration process to make it work.
Thanks in advance.
You can add Java to your app by adding the heroku/jvm buildpack like this:
$ heroku buildpacks:add -i 1 heroku/jvm
Then redeploy with git commit --allow-empty and git push heroku master.

Client ID must be set as api.adwords.clientId in ads.properties. AdWords on GAE

I have a Java program that makes reports from Google AdWords. The authentication is working correctly, since I have no issues issuing my reports with the Java API, but now when I run my Java API on localhost with mvn appengine:devserver, it gives me this exception:
Client ID must be set as api.adwords.clientId in ads.properties.
If you do not have a client ID or secret, please create one in the API console: https://console.developers.google.com/project caused by: [clientId]
The only thing I change between the Java app and the GAE app is the doGet method: I comment it out and change my run() method, which is called by doGet, to main(String[] args).
Fixed, here's how:
This issue is due to the path of the properties file.
When the app is deployed it runs on a server, so it doesn't read the properties file from the same directory as when it's run from Eclipse with the play button or with javac your_java_app.java.
I moved my ads.properties from resources to src/main/webapp/WEB-INF and changed the OAuth2Credential from
..fromFile().. to
..fromFile("WEB-INF/ads.properties")..

How can I run Play framework in HTTPS only in the dev mode?

I'd like to run Play Framework over HTTPS only in the development mode and I've done so using the following bit of configuration:
https.port=9443
trustmanager.algorithm=JKS
keystore.file=conf/certificate.jks
keystore.password=password
certificate.password=password
application.mode=dev
%prodenv.application.mode=prod
This works when I run play run, but in production we run play run --%prodenv and I want to disable HTTPS there, as HTTPS is handled by Nginx. I'm lost as to how to do this. I would like to do it via the configuration file and not via additional command-line arguments, as that defeats the purpose of having all my application configuration in the application.conf file.
One way to do it is to have two conf files: application.conf and prod.conf.
application.conf stays the way it is, and prod.conf would look something like:
include "application.conf"
https.port = myProdPort
### other params to be overwritten
when launching your application in prod you can do
play run -Dconfig.file=/mypath/prod.conf
Alternatively, with sbt:
sbt run -Dhttps.port=9443 -Dhttp.port=disabled
Rather than have two configuration files, I achieved this by using just one. In order to run the app, I run play run --%dev and this is what the configuration looks like.
%dev.https.port=9443
%dev.trustmanager.algorithm=JKS
%dev.keystore.file=conf/certificate.jks
%dev.keystore.password=password
%dev.certificate.password=password
Similar to the other answer by Johan, I do it the reverse way: my application.conf is for prod and I run a dev.conf just in development:
include "application.conf"
https.port = devPort
And run locally like so:
play run -Dconfig.file=dev.conf
This way you don't have to change any configuration on your prod server.
You could remove the https.port param from your conf file and pass it in via the command line when you run it in development mode:
play run -Dhttps.port=9443
See: Specifying server address and port
Play Framework runs on a Netty server; you can override the server configuration using -D parameters.
In sbt it can be done like:
sbt "project pepe-grillo-server" "run -Dhttps.port=42443 -Dhttp.port=disabled"
If you are using a custom SSL engine provider (CustomSSLEngineProvider), you can use the command below to run Netty in SSL mode:
./sbt "-Dhttps.port=9443" "-Dplay.server.https.engineProvider=services.https.CustomSSLEngineProvider" "-Dconfig.resource=<config file>" run
Once the server is up and running you can curl the endpoint to check cert validity.
curl -v https://127.0.0.1:9443
