I am implementing a workflow of tasks in a Java program.
One of the tasks is running some commands on multiple Linux servers.
The servers are determined dynamically - read from an XML file.
I examined Rundeck's API, but it seems that I have to configure the servers in advance, which doesn't suit my needs.
Any ideas?
One answer is to stick with Rundeck; I think you just need it to load your nodes dynamically.
To make it dynamic you need to refresh the project: either restart Rundeck or use the API refresh method. I haven't tried the latter, but I think it is the better solution. The resource model supports flat files as well as pulling your node list of servers from an external source URL.
But if you are rolling your own, a better answer may be SSH, so consider an SSH library for Java.
If you consider using an external API, look into JSCAPE's SSH Factory for Java. Documentation can be found here.
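If you go the roll-your-own route, the open-source JSch library is another common choice and is available in Maven Central. A minimal sketch of running one command on one host; the host name, user, and key path are assumptions, and in your case you would loop over the hosts read from your XML file:

    import com.jcraft.jsch.ChannelExec;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;
    import java.io.InputStream;

    public class RemoteCommandSketch {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            jsch.addIdentity("/home/me/.ssh/id_rsa");          // hypothetical key path

            Session session = jsch.getSession("deploy", "server1.example.com", 22);
            session.setConfig("StrictHostKeyChecking", "no");  // fine for a lab, not for production
            session.connect();

            ChannelExec channel = (ChannelExec) session.openChannel("exec");
            channel.setCommand("uptime");
            InputStream out = channel.getInputStream();        // must be obtained before connect()
            channel.connect();

            // Stream the remote command's output to stdout.
            byte[] buf = new byte[4096];
            int len;
            while ((len = out.read(buf)) != -1) {
                System.out.write(buf, 0, len);
            }

            channel.disconnect();
            session.disconnect();
        }
    }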
I would like to build a CentOS 7 instance on AWS and install Apache to set up a web server.
After that, I would like to modify the config files /etc/hosts and /etc/httpd/conf.d/test.conf, where test.conf is created by me.
Can I use Java to modify the files directly? Or should I create the files and replace the old ones on the instance? I am a little confused about the feasibility. Please, someone help.
There seem to be a few questions here, so I've split them out.
Q: Can I use programming language X to modify a file on the local filesystem?
A: Yes, with very few exceptions. For Java, yes (if the instance has a JRE); see the sketch after these questions.
Q: Should I use Java?
A: Probably not the first choice (you could probably do what you need in a shell script at launch).
Q: Should I create the Apache config files dynamically or build them into an AMI?
A: Difficult to answer without more information. There are pros and cons to AMIs. If it's simple and quick to create/modify the files on launch, then I'd do it that way.
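Coming back to the first question: a minimal sketch of editing a config file from Java with java.nio, assuming you just want to append an entry to /etc/hosts (the IP and hostname are made up, and the process needs write permission on the file, e.g. run as root):

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import java.util.Collections;

    public class HostsEditSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical entry; adjust to whatever your hosts / test.conf change needs.
            Path hosts = Paths.get("/etc/hosts");
            Files.write(hosts,
                    Collections.singletonList("203.0.113.10  test.internal"),
                    StandardCharsets.UTF_8,
                    StandardOpenOption.APPEND);
        }
    }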
DevOps is a big subject and there are many options available to you for bootstrapping EC2 instances. Pre-baked AMIs are one option. Another simple option you might consider is to write user data scripts that run at launch time and set up the instance for you (see the simple nginx example). They can install software, modify config files, start services, and more. They can also pull collateral such as pre-staged config files from S3, which can be a handy option.
I have an embedded system with a Python interface. Currently the system uses a (system-local) XML file to persist data in case the system gets turned off, but normally the system is running the entire time. When the system starts, the XML file is read in and the information is stored in Python objects; that information is then used for processing. My aim is to edit this information remotely (over TCP/IP), even while the process is running. I would like to use Java to get this done, and I have been thinking about some way to share the objects. The problem is that I'm missing some keywords to find the right technologies for this. What I found is SOAP, but I think it is not the right thing for this case; is that true? I'm grateful for any tips.
As I understand it, you are using an XML file to store the start-up configuration.
My assumptions about the interface between your Java and Python apps:
you want your Java application to retrieve objects over the Python interface,
process them locally, and then send them back to the Python interface to reload the config?
So, depending on your circumstances, you can work out something with the following:
Jython (see the sketch further below)
Pickle (if you have no restriction on startup config file format or can afford to do conversion)
https://pypi.python.org/pypi/Pyro4
Also you can get some ideas from here:
Sharing a complex object between Python processes?
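To give a feel for the Jython option: it embeds a Python interpreter inside the JVM, so it only applies if your Python code can run under Jython rather than as a separate CPython process. A minimal sketch; the variable names are made up:

    import org.python.core.PyObject;
    import org.python.util.PythonInterpreter;

    public class JythonSketch {
        public static void main(String[] args) {
            PythonInterpreter interp = new PythonInterpreter();

            // Run Python code inside the JVM and read a value back into Java.
            interp.exec("config = {'interval': 5}");
            PyObject interval = interp.eval("config['interval']");
            System.out.println("interval = " + interval);

            // Push a new value from Java into the Python side.
            interp.set("newInterval", 10);
            interp.exec("config['interval'] = newInterval");
        }
    }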
You should ask your Python application to open an XML-RPC socket which clients can connect to. This lets an outside application execute an endpoint, which can manipulate your Python object values in some way. There are several good choices for Java XML-RPC libraries, including the amazing org.apache.xmlrpc library.
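A minimal sketch of the Java client side with the org.apache.xmlrpc library, assuming the Python process exposes a set_value method (for example via SimpleXMLRPCServer) on port 8000; the host, port, and method name are assumptions:

    import java.net.URL;
    import org.apache.xmlrpc.client.XmlRpcClient;
    import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;

    public class XmlRpcClientSketch {
        public static void main(String[] args) throws Exception {
            XmlRpcClientConfigImpl config = new XmlRpcClientConfigImpl();
            config.setServerURL(new URL("http://embedded-host:8000/RPC2"));

            XmlRpcClient client = new XmlRpcClient();
            client.setConfig(config);

            // Call the (hypothetical) method exposed by the Python XML-RPC server.
            Object result = client.execute("set_value", new Object[] { "interval", 10 });
            System.out.println("Server replied: " + result);
        }
    }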
My application currently reads data by copying a filesystem tree from a remote machine via a shared disk, so it works as a filesystem deep copy from the application's point of view.
This solution is somewhat limiting and I also want to support a second option: copying the subtree via HTTP.
The library should do something like wget --recursive which parses the directory listing and use it for traversing down the tree.
I could not find any java library doing this.
I am able to implement such functionality myself (with NekoHTML or something similar), but I don't like reinventing the wheel.
Is there such a library that I can easily use within my application?
Ideally:
published in Maven Central Repository as I am using Maven for builds
with as few dependencies on other libraries as possible
no need for robots exclusion support - will operate on limited set of interim servers only
Thanks.
Note: please post pointers to homepages of libraries which you personally used.
The Norconex HTTP Collector traverses websites like a tree, given one or more start URLs. It can be used as a Java library in your application, or as a command-line application. You can decide what to do with each document it crawls. Being a full-blown web crawler, it probably does more than what you are after, but you can configure it to suit your needs.
For instance, it will by default extract text found in your documents, and it lets you decide what to do with that text by plugging in a "Committer" (i.e. where to "commit" the extracted content). In your case I think you want the raw documents only and can ignore the text-conversion part. You can do so by plugging in your own document processor, followed by "filtering out" documents so they stop being processed once you have dealt with them your own way.
The project is open source, hosted on GitHub, and is fully "mavenized". It supports robots.txt, but you can turn that off if you want. The only downside for you is having more than a few dependencies, but since you are using Maven, those should get resolved automatically without effort. You'll find Maven repository info on the product site.
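If the dependency footprint still bothers you and you do end up rolling your own as you mentioned, a recursive fetch over directory listings does not need much code. A rough sketch with jsoup (a single Maven Central dependency); the start URL, target directory, and link-filtering rules are assumptions:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.HashSet;
    import java.util.Set;
    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;

    public class ListingMirrorSketch {

        private final String rootUrl;
        private final Path rootDir;
        private final Set<String> visited = new HashSet<>();

        ListingMirrorSketch(String rootUrl, Path rootDir) {
            this.rootUrl = rootUrl;
            this.rootDir = rootDir;
        }

        public static void main(String[] args) throws IOException {
            // Hypothetical start URL and target directory.
            new ListingMirrorSketch("http://interim-server/exports/", Paths.get("/tmp/mirror"))
                    .mirror("http://interim-server/exports/");
        }

        void mirror(String url) throws IOException {
            if (!visited.add(url)) {
                return;                                   // already seen, avoid loops
            }
            Document listing = Jsoup.connect(url).get();  // parse the directory listing page
            for (Element link : listing.select("a[href]")) {
                String child = link.absUrl("href");
                if (!child.startsWith(url) || child.equals(url) || child.contains("?")) {
                    continue;                             // only descend; skip parent and sort links
                }
                if (child.endsWith("/")) {
                    mirror(child);                        // subdirectory: recurse
                } else {
                    byte[] body = Jsoup.connect(child).ignoreContentType(true)
                            .maxBodySize(0).execute().bodyAsBytes();
                    Path file = rootDir.resolve(child.substring(rootUrl.length()));
                    Files.createDirectories(file.getParent());
                    Files.write(file, body);
                }
            }
        }
    }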
I just got a requirement to create a small (I assume standalone) utility to hit some code in our web application to do some custom processing of files from the app and then dump the files into a shared drive. My question is what is the best way for doing this? Do I just create a small app and then jar it up and run it off a command line or is there a better way?
Sorry, I didn't give enough detail. It's an old application, over 10 years old, so while it's been upgraded to JDK 1.6, most of the code uses the old collections, old loops, etc. There aren't any interfaces; the code is very tightly coupled and uses inheritance with lots of nested objects. The web app will do the processing. I think what they want is some code outside of the application code that will log in and then fire off the file-processing code. Prior to this I had upgraded their version of Windward Reports in a separate branch, and they want to make sure that the processed files (contracts, forms, etc.) don't get altered greatly, as there are legal requirements on fonts and layouts. So this utility will go in, fire off the list of reports (a few thousand), and dump them to a shared drive so they can be viewed with another tool for comparison, en masse, based on rules you can automate with that commercial tool. I was thinking of creating a small class with a main method, jarring it up, and, while the web server is running with my upgraded branch code, running the utility off the command line to fire it off.
There's not enough to go on here. How are the web app's functions exposed? If it's a REST interface then wget/curl/spring-rest-template are the way to go. If it's something like a JSF app then you're going to need something like Selenium to imitate a browser. If the functionality is in a shared library (JAR) then the web tier never even comes into play.
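For the REST case, the spring-rest-template route really is only a few lines; a sketch, with the endpoint URL entirely made up for illustration:

    import org.springframework.web.client.RestTemplate;

    public class ReportTriggerSketch {
        public static void main(String[] args) {
            RestTemplate rest = new RestTemplate();

            // Hypothetical endpoint that kicks off the report processing in the web app.
            String reply = rest.postForObject(
                    "http://localhost:8080/app/reports/run?output=/mnt/share/reports",
                    null, String.class);

            System.out.println("Web app replied: " + reply);
        }
    }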
Well, I was originally looking at creating a standalone utility jar that I would run off the command line to connect with URLConnection to the app, but I found there is already testing code built into the application that I can run from a command line as long as I deploy the new code with the existing code. The utility will dump out the files to a shared drive and then XTest can be run to compare files. After reviewing the capabilities of XTest, it appears that it can handle the comparison of files well.
I need to add/delete domain names on a BIND 9.x DNS server. That means I need to read/write the zone files. How can I do this using Java?
It is possible to read the zone files over SFTP (JSch), but parsing and appending to the files is too complex.
Webmin uses Perl scripts and performs these operations successfully.
DNSBoss uses Java and performs these operations successfully, but it is not open source.
It seems dnsjava could be helpful, but I am not sure; I need to dig into it more.
Do you have an idea for this operation? Any open source libraries, APIs which can be helpful?
Thanks a lot,
I realized there is a patch named Bind DZL. I should use it and forget about manipulating files.
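For anyone who ends up on the dnsjava route instead: it can avoid touching the zone files entirely by sending RFC 2136 dynamic updates to BIND (the zone must have allow-update configured). A minimal sketch; the zone, host, address, and name server below are assumptions:

    import org.xbill.DNS.Message;
    import org.xbill.DNS.Name;
    import org.xbill.DNS.Rcode;
    import org.xbill.DNS.Resolver;
    import org.xbill.DNS.SimpleResolver;
    import org.xbill.DNS.Type;
    import org.xbill.DNS.Update;

    public class DnsUpdateSketch {
        public static void main(String[] args) throws Exception {
            Name zone = Name.fromString("example.com.");
            Name host = Name.fromString("www.example.com.");

            // Build an RFC 2136 dynamic update: add an A record for the host.
            Update update = new Update(zone);
            update.add(host, Type.A, 3600, "192.0.2.10");
            // To remove the name instead: update.delete(host);

            // Send the update to the BIND server; add a TSIG key via resolver.setTSIGKey(...) if required.
            Resolver resolver = new SimpleResolver("ns1.example.com");
            Message response = resolver.send(update);
            System.out.println("Rcode: " + Rcode.string(response.getRcode()));
        }
    }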