I am using WebSphere Application Server. For the server configuration I am restoring it from an existing CAR file. I have an XML file where I hold all the JMS, queue, and datasource related information. How can I take the information from the XML file in a script to configure the WebSphere queue?
Now I want to write a program/script in Java so that I can configure WAS directly by running the script or program, instead of restoring it from the CAR file. But I don't know how to proceed. Please suggest an approach, with an example, so I can do the task.
You'll need to be disciplined about doing no manual configuration and using automated wsadmin scripts for all of the WAS configuration.
There is a helper library here with some reusable constructs:
https://github.com/wsadminlib/wsadminlib
And most configuration done in the WAS admin console comes with "command assistance" which shows you the underlying wsadmin commands being invoked when a change is submitted.
http://www.ibm.com/developerworks/websphere/library/techarticles/0812_rhodes/0812_rhodes.html
This means that instead of a save/restore kind of operation, you're scripting the entire bring-up to make it more repeatable.
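For the XML side of your question, one option is a small Java program that parses your XML file and emits the wsadmin (Jython) commands that command assistance shows you. A hedged sketch, assuming a made-up layout like <queue name="..." jndiName="..."/> and a SIB JMS queue; the actual AdminTask command, its parameters, and the scope object depend on your JMS provider and topology:

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class QueueConfigGenerator {
    public static void main(String[] args) throws Exception {
        // Parse the XML file that holds the JMS/queue/datasource information
        NodeList queues = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new File(args[0])).getElementsByTagName("queue");
        for (int i = 0; i < queues.getLength(); i++) {
            Element q = (Element) queues.item(i);
            // Emit one wsadmin command per queue; run the collected output with
            // wsadmin.sh -lang jython -f generated.py (after defining 'scope')
            System.out.printf(
                "AdminTask.createSIBJMSQueue(scope, ['-name', '%s', '-jndiName', '%s', '-queueName', '%s'])%n",
                q.getAttribute("name"), q.getAttribute("jndiName"), q.getAttribute("name"));
        }
    }
}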
We are planning to create a new processing mechanism which consists of listening to a few directories, e.g. /opt/dir1, /opt/dirN, and for each document created in these directories, starting a routine to process it, persisting its records in a database (via REST calls to an existing CRUD API), and generating a protocol file to another directory.
For testing purposes, I am not using any modern (or even decent) framework/approach, just a regular Spring Boot app with a WatchService implementation that listens to these directories and polls for files to be processed as soon as they are created. It works, but I will clearly have performance problems at some point when I move to production and start receiving dozens of files to be processed in parallel, which isn't the case in my example.
After some research and some tips from a few colleagues, I found Spring Batch + Spring Cloud Data Flow to be the best combination for my needs. However, I have never dealt with either Batch or Data Flow before, and I'm kind of confused about what these blocks should be and how I should build them in order to get this routine going in the simplest and most performant manner. I have a few questions regarding its added value and architecture and would really appreciate hearing your thoughts!
I managed to create and run a sample batch file ingest task based on this section of the Spring docs. How can I launch a task every time a file is created in a directory? Do I need a Stream for that?
If I do, how can I create a stream application that launches my task programmatically for each new file, passing its path as an argument? Should I use RabbitMQ for this purpose?
How can I keep some variables externalized for my task, e.g. the directory paths? Can I have these streams and tasks read an application.yml somewhere other than inside their jars?
Why should I use Spring Cloud Data Flow alongside Spring Batch and not only a batch application? Is it just because it spawns parallel tasks for each file, or do I get some other benefit?
Talking purely about performance, how would this solution compare to my WatchService + plain processing implementation, considering only the sequential processing scenario, where I'd receive only 1 file per hour or so?
Also, if any of you have a guide or sample on how to launch a task programmatically, I would really appreciate it! I am still searching for that, but it doesn't seem like I'm doing it right.
Thank you for your attention and any input is highly appreciated!
UPDATE
I managed to launch my task via the SCDF REST API, so I could keep my original Spring Boot app using WatchService and launch a new task via Feign or XXX. I still know this is far from what I should be doing here. After some more research, I think creating a stream using a file source and sink would be the way to go here, unless someone has another opinion, but I can't get the inbound channel adapter to poll multiple directories, and I can't have multiple streams, because this platform is supposed to scale to the point where we have thousands of participants (or directories to poll files from).
Here are a few pointers.
I managed to create and run a sample batch file ingest task based on this section of the Spring docs. How can I launch a task every time a file is created in a directory? Do I need a Stream for that?
If you have to launch it automatically upon an upstream event (e.g. a new file), yes, you could do that via a stream (see example). If the events are coming off of a message broker, you can also consume them directly in the batch job (e.g. with AmqpItemReader).
If I do, how can I create a stream application that launches my task programmatically for each new file, passing its path as an argument? Should I use RabbitMQ for this purpose?
Hopefully the above example clarifies it. If you want to launch the Task programmatically (not via the DSL/REST/UI), you can do so with the new Java DSL support, which was added in 1.3.
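One way to do that from plain Java is SCDF's REST client, DataFlowTemplate, which the Java DSL builds on. A minimal sketch; the server URL, task name, and argument format are assumptions to adapt to your setup:

import java.net.URI;
import java.util.Collections;
import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;
import org.springframework.cloud.dataflow.rest.client.TaskOperations;

public class FileTaskLauncher {
    public static void main(String[] args) {
        // Point the client at your SCDF server (URL is an assumption)
        DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));
        TaskOperations tasks = dataFlow.taskOperations();
        // Launch the already-registered task, passing the new file's path as an argument
        tasks.launch("fileIngestTask",
                Collections.emptyMap(),
                Collections.singletonList("--filePath=/opt/dir1/new-file.csv"));
    }
}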
How can I keep some variables externalized for my task, e.g. the directory paths? Can I have these streams and tasks read an application.yml somewhere other than inside their jars?
The recommended approach is to use Config Server. Depending on the platform where this is being orchestrated, you'd have to provide the config server credentials to the Task and its sub-tasks, including batch jobs. In Cloud Foundry, we simply bind a config server service instance to each of the tasks, and at runtime the externalized properties are resolved automatically.
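Even without a full Config Server, Spring Boot's externalized configuration lets a task read properties from outside its jar (e.g. an application.yml next to the jar or a --spring.config.location override). A hedged sketch; the "ingest" prefix and property name are made up for illustration:

import java.util.List;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Binds ingest.directories from any externalized source: an application.yml
// outside the jar, a --spring.config.location override, or Config Server.
@Component
@ConfigurationProperties(prefix = "ingest")
public class IngestProperties {
    private List<String> directories;

    public List<String> getDirectories() { return directories; }

    public void setDirectories(List<String> directories) { this.directories = directories; }
}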
Why should I use Spring Cloud Data Flow alongside Spring Batch and not only a batch application? Is it just because it spawns parallel tasks for each file, or do I get some other benefit?
As a replacement for Spring Batch Admin, SCDF provides monitoring and management for Tasks/batch jobs. The executions, steps, step progress, and stack traces upon errors are persisted and available to explore from the Dashboard. You can also use SCDF's REST endpoints directly to examine this information.
Talking purely about performance, how would this solution compare to my WatchService + plain processing implementation, considering only the sequential processing scenario, where I'd receive only 1 file per hour or so?
This is implementation specific. We do not have any benchmarks to share. However, if performance is a requirement, you could explore the remote partitioning support in Spring Batch. You can partition the ingest or data-processing Tasks across "n" workers, so that you achieve parallelism.
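To give a feel for partitioning, here is a hedged sketch of a Partitioner that gives each file in a directory its own partition, which a partitioned step can then fan out to "n" local or remote workers (the directory path and context key are illustrative):

import java.io.File;
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class FilePartitioner implements Partitioner {
    private final File dir = new File("/opt/dir1"); // assumed drop directory

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        File[] files = dir.listFiles();
        if (files == null) {
            return partitions;
        }
        for (int i = 0; i < files.length; i++) {
            // Each worker step reads "filePath" from its step execution context
            ExecutionContext ctx = new ExecutionContext();
            ctx.putString("filePath", files[i].getAbsolutePath());
            partitions.put("partition" + i, ctx);
        }
        return partitions;
    }
}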
I have a complicated .jar file that I need to run on Azure (C# ASP.NET). On my local system, I simply run java.exe and pass it the jar as an argument. I would like to do the same on the server; however, I don't know where java.exe is located.
I have had a look at the environment variables and found many jdk and jre references, so I assume it is possible.
I cannot use IKVM, as the jar is complex enough that it doesn't run correctly under it.
So, as a summary: where is java.exe located on Azure? And if it isn't there (and I can't do this), what else can I do?
EDIT:
To clarify more: I am developing a web app using ASP.NET. I have a .jar file that I have to run, and on the local machine I run it using:
// "java" must resolve on the server's PATH; on Azure, an absolute path to java.exe may be needed
ProcessStartInfo processStartInfo = new ProcessStartInfo("java");
processStartInfo.Arguments = arguments;
//more options
Process process = new Process();
process.StartInfo = processStartInfo;
process.Start();
process.WaitForExit();
Now I am publishing this website to Microsoft's Azure services, and I would like to do the same thing. Except, running it as is tells me that the process can't be run (i.e. it doesn't understand what "java" is). I want to find a way to be able to call java as a process. Obviously, if I know the path to java.exe, I can simply run that path as a command and I'll be done (i.e. it'll execute java). That's what I need help with.
As derpirscher mentions in the comment you haven't specified what type of Azure service you want to use, and you haven't specified the nature of your Java code (does it listen for incoming connections on some port? does it talk to any external services? etc.). More info would help us give you a better answer.
That said... one option to start with would be Azure Web Jobs, which allow you to upload and run (among other options) a Java .jar file:
Azure Web Jobs overview
As the info at that link indicates, you can run on-demand, continuously, or on a periodic schedule. Some additional details can be found here:
Executing Java Web Jobs on Azure
For more general information about both running Java code on Azure and also interacting with Azure services from within Java code, see here:
Azure Java Dev Center
Specifically, here are some additional deployment options beyond Web Jobs:
Deploying Java code on Azure
Best of luck!
EDIT based on your additional feedback:
So if I'm understanding correctly, you want to invoke a Java .jar file by spawning a new process from an ASP.NET application when a user inputs a certain query, etc.?
I can think of two potential options:
Host your ASP.NET application and the .jar on an Azure virtual machine that you customize with the correct version of Java, etc. This would allow you to configure Java how you like, on what path you want, etc.
Decouple the resources used to host your ASP.NET application from those used to invoke the Java code by (for instance) hosting your site as an Azure Web App and writing a message from there to an Azure storage queue each time the Java code should execute. On the receiving side of the queue, you'd have an Azure Web Job configured to listen on that queue and execute your .jar file whenever a new message arrives.
Triggering a Web Job from an Azure Queue
In general, option 2 will be preferable from a scalability and pure design standpoint (it allows you to separate the concerns of accepting queries vs. processing them, align costs most directly with actual resource consumption, etc.), but option 1 is perhaps easier for someone unfamiliar with Azure or cloud architecture.
Just know that, depending on the nature of the processing you have to perform, the number of expected concurrent users, etc., a VM-based solution may be more expensive than something similar to option 2 above. Like so many things in cloud, it's ultimately a time vs. expense tradeoff that you have to make here.
Assuming that your C#/ASP.NET application is running on Azure App Service (e.g. as an Azure Web App), you can access the Kudu console via the URL https://<your-webapp-name>.scm.azurewebsites.net/DebugConsole, then run the command cd ..\"Program Files (x86)"\Java to move to the directory containing the Java SDKs for different versions.
Please try to use the absolute path to java.exe (like D:\Program Files (x86)\Java\jdk<version>\bin\java.exe) as the argument for the C# ProcessStartInfo class.
However, I still recommend that you try to deploy the application on an Azure VM and run the app by configuring the related environment variables on the VM.
I included a scheduled job in my WAR file through Quartz and Spring. In case the scheduled job misses, I have to execute a method Class_A.Method_A() explicitly.
In order to execute the method, I plan to create a static main() method in Class_A so that I can execute java -cp $CLASSPATH Class_A. However, the class is inside the WAR file; how can I do it?
In addition, the WAR file has its own data source and log4j configuration, and Method_A does database access and logging through them; if I call it from the command prompt, is there any conflict?
If calling it through a command prompt is not good practice, what is a better way? Please help.
Why are you trying to execute 'java -cp'? This will be a separate JVM execution, and hence you will not be able to directly access the resources in the JVM running the web application (meaning the objects spawned in that JVM's memory space by the web application). [This answers your question about conflicts.]
Please mention which application server your web application is running on.
Seeing your comment about the System Administrator (though I would have mentioned this regardless): have you ever heard of Service MBeans? You can try them.
Your scenario is a very common one, where people need to access a particular class (or rather, an instance of the class) running inside a JVM. You certainly need something that loads up along with the application.
You can write a Service MBean to run along with (inside) your web application. This means you expose the action as an MBean operation. Then you can write a Java client to interact with the MBean and make calls to its exposed methods.
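A minimal standard MBean sketch (the names are illustrative; Class_A.Method_A() is the existing method from your question):

import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// Standard MBean convention: the interface name is the implementation class name + "MBean"
public interface MaintenanceMBean {
    void runMethodA();
}

class Maintenance implements MaintenanceMBean {
    @Override
    public void runMethodA() {
        // Runs inside the webapp's JVM, so the existing data source and log4j config apply
        Class_A.Method_A();
    }

    // Call this at webapp startup, e.g. from a ServletContextListener
    static void register() throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        server.registerMBean(new Maintenance(), new ObjectName("webapp:type=Maintenance"));
    }
}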
Note that your application server may provide authentication for accessing MBeans.
The other option is a JMS implementation. Set up a JMS queue whose listener executes the action by interacting with the classes of the web application. Obviously, the listener would load alongside the web application. An EJB implementation would allow you to load the listener via a simple EJB XML descriptor or through annotations.
Then you write a separate piece of Java code that sends message commands to the JMS queue.
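A sketch of such a standalone sender, assuming illustrative JNDI names and a JNDI environment (e.g. a jndi.properties for your application server's provider):

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.naming.InitialContext;

public class CommandSender {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/CommandQueue");
        Connection conn = cf.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            // The listener deployed alongside the webapp reacts to this command
            // by invoking the target method inside the webapp's JVM
            producer.send(session.createTextMessage("RUN_METHOD_A"));
        } finally {
            conn.close();
        }
    }
}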
All application servers provide the option of authentication.
I have a webapp with an architecture I'm not thrilled with. In particular, I have a servlet that handles a very large file upload (via commons-fileupload), then processes the file, passing it to a service/repository layer.
What has been suggested to me is that I simply have my servlet upload the file and a service on the backend do the processing. I like the idea, but I have no idea how to go about it. I do not know JMS.
Other details:
- App is a GWT app split into the recommended client/server/shared subpackages, using an MVP architecture.
- Currently, I am only running in GWT hosted mode, but am planning to move to Tomcat in the very near future.
I'm perfectly willing to learn whatever I need to in order to get this working (in fact, that's the point of writing the app). I'm not expecting anyone to write code for me, but can someone point me in the right direction to get started?
There are many options for this scenario, but the simplest may be to just copy the uploaded file to a known location on the file system and have a background daemon monitor that location, processing files as it finds them.
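A minimal sketch of such a daemon using java.nio's WatchService; the drop directory is illustrative and the println stands in for a call to your service layer:

import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class UploadMonitor {
    public static void main(String[] args) throws Exception {
        Path dropDir = Paths.get("/var/uploads/incoming"); // assumed drop location
        WatchService watcher = FileSystems.getDefault().newWatchService();
        dropDir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
        while (true) {
            WatchKey key = watcher.take(); // blocks until an event arrives
            for (WatchEvent<?> event : key.pollEvents()) {
                Path file = dropDir.resolve((Path) event.context());
                System.out.println("Processing " + file); // hand off to the service layer here
            }
            key.reset(); // re-arm the key for further events
        }
    }
}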
@Jason, there are many ways to solve your problem.
i) Dump your file data into a database column of type BLOB, and have a DB polling thread check the table periodically for newly inserted files.
ii) Dump the file into the file system and have a file monitoring process.
The benefit of i) over ii) is that a DB is a centralized and fast resource, whereas file systems are generally slower and non-centralized in nature.
So basically, the servlet would dump the file either to the DB or to the file system. As for what will process that dumped file: a) it could be the monitoring process discussed above, or b) you can use JMS, which is asynchronous in nature; that means the servlet would put a trigger event in a queue, which would asynchronously start a new processing thread.
Don't introduce JMS into your system unnecessarily if you are OK with a monitoring process.
This sounds interesting and familiar to me :). We do it in a similar way.
We have four projects, and all four include file upload and file processing (image/video/PDF/docs, etc.). So we created a single project to handle all file processing; it works something like this:
All four projects and the File Processor use Amazon S3/our file storage for file storage, so storage is shared among all five projects.
We make a request to the File Processor, providing details in XML via an HTTP request, including the file path on S3/storage, AWS authentication details, and file conversion/processing parameters. The File Processor does the processing, puts the processed files on S3/storage, constructs XML with the processed files' details, and sends the XML back in the response.
We use the Spring Framework and Tomcat.
Since this is foremost a learning exercise, you need to pick an easy to use JMS provider. This discussion suggested FFMQ just one year ago.
Since you are starting with a simple processor, you can keep it simple and use a JMS Queue.
In the simplest form, each message sent by the servlet has to correspond to a single job. You can either put the entire payload of the upload in the message, or just send a filename as a reference to the content. These are details you can refactor later.
On the processor side, if you are using Java EE, you can use a message-driven bean. If you are not, then I would suggest a 3-JVM solution -- one each for Tomcat, the JMS server, and the message processor. This article includes the basics of a message-consuming client.
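To give a feel for the consumer side, a hedged sketch of a message-consuming listener that receives the filename reference and hands it to the processing layer (connection and queue wiring omitted; the names are illustrative):

import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

public class UploadProcessor implements MessageListener {
    @Override
    public void onMessage(Message message) {
        try {
            // The servlet sends just a filename; the large payload stays on disk
            String filename = ((TextMessage) message).getText();
            System.out.println("Processing upload: " + filename); // call the service/repository layer here
        } catch (JMSException e) {
            e.printStackTrace();
        }
    }
}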
How would I go about writing some code to allow access to a Java class in my webapp from the command line?
E.g. I have a Java class with a command line interface that can run code in the context of the webapp, with access to the DB etc. I want to log on to the machine hosting my WARred app in Tomcat and be able to interact with it.
Where should I start looking?
Thanks
Do you just want to run class files that happen to be bundled in the WAR, or do you want to interact with the actual, running WAR instance? If the former, then the WAR is just a normal Jar file, and you can execute classes in it just like any other Jar file.
If you want to interact with the running WAR, then you might want to look at JMX.
All current JDKs (at least 1.5+) come with JMX "for free". It's easy to create little interface classes to be used as commands to interact with your WAR.
Then you would need to create a command line program that connects to the WAR via JMX, or you can use a tool like JConsole (which comes with the JDK, but is a GUI) to interact with your instance. There are other JMX clients out there as well.
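A hedged sketch of such a command line client; the JMX service URL, port, and MBean/operation names are assumptions that depend on how you expose JMX from the Tomcat JVM:

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class WarCommandClient {
    public static void main(String[] args) throws Exception {
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://localhost:9010/jmxrmi"); // assumed endpoint
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            // Invoke an operation on an MBean your webapp registered at startup
            Object result = mbsc.invoke(new ObjectName("webapp:type=Commands"),
                    "runReport", new Object[0], new String[0]);
            System.out.println(result);
        }
    }
}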
If none of that is attractive, there's always web services.
A suggestion:
Your command line interface class should accept an InputStream as its input and write its output to a provided OutputStream (it can't hardcode output to System.out and input to System.in). Then you'd write a server class that listens for connections on a certain port. When a connection is made, the server takes the InputStream from the connection and gives it to the command line class, which writes its output to an OutputStream whose data is passed back to the client that made the connection.
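A minimal sketch of that server, which would be started from the webapp; the port and echo-style handling are illustrative, with the handler standing in for the command line interface class described above:

import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.Scanner;

public class CommandServer implements Runnable {
    @Override
    public void run() {
        try (ServerSocket server = new ServerSocket(9999)) { // assumed port
            while (true) {
                try (Socket client = server.accept()) {
                    // Wire the connection's streams to the command line class
                    handle(client.getInputStream(), client.getOutputStream());
                } catch (Exception perConnection) {
                    perConnection.printStackTrace(); // keep serving other clients
                }
            }
        } catch (Exception fatal) {
            fatal.printStackTrace();
        }
    }

    // Stand-in for the command line interface class described above
    private void handle(InputStream in, OutputStream out) {
        Scanner scanner = new Scanner(in);
        PrintWriter writer = new PrintWriter(out, true);
        while (scanner.hasNextLine()) {
            String command = scanner.nextLine();
            writer.println("received: " + command); // dispatch to real commands here
        }
    }
}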