I have a system that consists of several thick client apps that communicate with each other. The apps currently communicate directly with each other over RMI, but I am exploring options for using a messaging framework, specifically Camel.
I know Camel can be run standalone (we often do this when testing), but it is often deployed in a container or an ESB. Is it appropriate to run Camel in standalone mode if the only apps communicating with it are desktop (Swing) apps?
Well, yes and no. Camel is a message router; it helps you define routes for your calls. However, it will not choose the communication protocol for you. It just makes the integration quicker and easier (for example, use JMS for App1 <-> App2, RMI for App2 <-> App3 and some other protocol for App1 <-> App3).
Yes, Camel can be deployed standalone; the Camel documentation describes how to do this. I would advise creating a separate application for it (I would also use the Maven Shade plugin here to embed all dependencies).
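For illustration, here is a minimal hedged sketch of such a standalone application using Camel's Main class (assuming a recent Camel 2.x on the classpath; the JMS endpoint URIs are placeholders and would need a configured JMS component):

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.main.Main;

    public class StandaloneCamel {
        public static void main(String[] args) throws Exception {
            Main main = new Main();
            main.addRouteBuilder(new RouteBuilder() {
                @Override
                public void configure() {
                    // forward messages from one desktop app's queue to another's
                    from("jms:queue:app1.out").to("jms:queue:app2.in");
                }
            });
            main.run(); // blocks until the JVM is shut down
        }
    }

Packaged with the Shade plugin, this single jar can then be started with a plain java -jar.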
You might also consider using an ESB, for example ServiceMix or Fuse. However, that is a pretty big environment...
Actually, a vanilla installation (i.e. just unzipped) of ActiveMQ (5.5.1, for instance) comes bundled with Camel. Simply edit conf/activemq.xml to import the Camel configuration, and configure Camel in conf/camel.xml. ActiveMQ does not come with all Camel components, so drop any additional Camel jar files into /lib. You can just as easily drop your own .jar files, for instance one that defines a RouteBuilder, into /lib too.
To communicate between applications I would use a standalone ActiveMQ broker (typically two instances for failover) and embed Camel into the thick client applications. You can then use the POJO messaging features in Camel to achieve almost transparent communication.
See my blog for an example http://www.liquid-reality.de/x/NoBe
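To make the embedding concrete, here is a hedged sketch of a thick client embedding Camel and talking to a standalone ActiveMQ broker pair (the broker URLs, queue names and the UiUpdater bean are illustrative assumptions, not taken from the blog post):

    import org.apache.activemq.camel.component.ActiveMQComponent;
    import org.apache.camel.CamelContext;
    import org.apache.camel.ProducerTemplate;
    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.impl.DefaultCamelContext;

    public class ClientMessaging {
        public static void main(String[] args) throws Exception {
            CamelContext context = new DefaultCamelContext();
            // connect to the broker pair; failover: retries the other broker transparently
            context.addComponent("activemq", ActiveMQComponent.activeMQComponent(
                    "failover:(tcp://broker1:61616,tcp://broker2:61616)"));

            context.addRoutes(new RouteBuilder() {
                @Override
                public void configure() {
                    // messages from other applications arrive here and are handed to a plain bean
                    from("activemq:queue:thisclient.inbox").bean(new UiUpdater(), "onMessage");
                }
            });
            context.start();

            // sending to another application is a one-liner
            ProducerTemplate template = context.createProducerTemplate();
            template.sendBody("activemq:queue:otherapp.inbox", "hello from the Swing client");
        }

        public static class UiUpdater {
            public void onMessage(String body) {
                System.out.println("received: " + body); // a real client would update the Swing UI here
            }
        }
    }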
I planned to use JBoss to deploy, manage and monitor Play Framework applications, but from https://github.com/play2war/play2-war-plugin/wiki/ConfigurationLoggingJBoss7 I found that JBoss and Play don't play very well together.
The official documentation only says that Apache or Nginx can be used as the HTTP server; no application server is mentioned there. Does anyone have ideas about what would be a suitable application server for Play Framework deployment? How about Apache Tomcat or GlassFish?
You don't need any application server to run a Play 2.x application. The application runs standalone; internally it uses Netty to handle the sockets, Akka to handle the concurrency, etc.
Usually Apache or Nginx is placed in front of a Play application to offload serving of static resources and HTTPS handling (both can be done directly in the application itself), and above all to allow public access to multiple applications on the same IP and port under different paths.
In Play 1.x you could build a WAR to run in a container as you describe. This feature was removed in Play 2.0 to promote the embedded Netty server as the main way to deploy applications. Now you need the play2-war plugin to achieve this functionality.
I have had lots of trouble trying to get my head around how to solve this scenario:
We have an integration application that uses Camel for integration. This application also has a REST API that exposes some services providing information about the application, for instance listing the active routes, etc.
I have created a user interface for this using AngularJS that connects to these REST services. My main problem is how to package this application as a self-contained jar file that provides both the user interface and all the Camel integration.
My working theory: use a separate Jetty server to serve the AngularJS files and let Camel expose the REST services. The problem with this is CORS, since the REST services reside on a different port than the Jetty server serving the web UI.
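To illustrate the working theory, here is a rough sketch of how one of the REST services could be exposed via camel-jetty with a CORS header added, so a UI served from another port can call it (the port, path and wildcard origin are illustrative assumptions, and preflighted OPTIONS requests would need extra handling):

    import org.apache.camel.builder.RouteBuilder;

    public class UiRestRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // REST endpoint consumed by the AngularJS UI
            from("jetty:http://0.0.0.0:8081/api/routes")
                // allow the UI, served from a different port, to call this endpoint
                .setHeader("Access-Control-Allow-Origin", constant("*"))
                .process(exchange -> exchange.getIn().setBody(
                        "active routes: " + exchange.getContext().getRoutes().size()));
        }
    }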
Some requirements for the solution:
Must be a single self-contained jar file.
The Camel integration is the main purpose; the web UI is secondary and only used for troubleshooting. There is no need for a high-performance web container, since the web UI is used by only a handful of users.
I have been struggling with this for a couple of days now and it feels like I am overcomplicating the solution. Help on how to solve this is greatly appreciated.
You could take a look at hawtio
http://hawt.io/
as that is how we do this. hawtio is a web console for Java and has plugins for Camel. It's built using AngularJS and uses REST to communicate with local or remote JVMs. To make the REST calls easier we use Jolokia.
Jolokia requires an agent to be embedded in the JVM, e.g. where Camel runs. That then helps with CORS and the like: http://jolokia.org/reference/html/security.html#d0e2490
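For completeness, the Jolokia JVM agent can also be started programmatically from inside the application. This sketch follows the programmatic-usage section of the Jolokia reference manual, but the exact class and constructor names are an assumption here and should be verified against the Jolokia version in use:

    import java.util.HashMap;
    import java.util.Map;
    import org.jolokia.jvmagent.JolokiaServer;
    import org.jolokia.jvmagent.JolokiaServerConfig;

    public class EmbeddedJolokia {
        public static void main(String[] args) throws Exception {
            // ASSUMPTION: class and constructor names as per the Jolokia JVM-agent docs
            Map<String, String> options = new HashMap<>();
            options.put("port", "8778");      // HTTP port the agent listens on
            options.put("host", "localhost"); // restrict to local access in this sketch
            JolokiaServerConfig config = new JolokiaServerConfig(options);
            JolokiaServer server = new JolokiaServer(config, false /* start immediately */);
            server.start();
            // the JVM's MBeans (including Camel's, when JMX is enabled) are now
            // reachable over REST at http://localhost:8778/jolokia/
        }
    }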
I am working on a large Java-based distributed system, so there will be multiple services running across multiple machines.
I am looking for an open source framework to manage these services (e.g. start/stop a service, install a new service remotely, etc.).
Apache Karaf seems to be a good choice, but underneath it uses Apache Felix (an OSGi implementation), which I have a hard time really understanding. In particular, it seems easy to define and register a service in Felix, but how do you invoke such a service remotely? Do you need a separate RPC mechanism to achieve that? There seem to be very few links describing it. In general, how do people use OSGi? Is Apache Felix out of date?
Is there any other framework that can be used to manage services, assuming I will have my own RPC layer (say RMI-based or Netty-based)?
I guess you first have to specify what you mean by a service.
There are remote services that use an RPC mechanism, for example EJB remote calls.
Another way to define services is to talk of web services, using either SOAP (XML) or JAX-RS (JSON) as the transport. And there is the definition of services in OSGi, which just means a separation of API (service definition) and implementation. To sum up, you could describe OSGi as a SOA for applications within the same virtual machine (it gives you much more, but this is one way to look at it from a service perspective).
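As a hedged illustration of that API/implementation separation in OSGi (the GreetingService interface and the activator are made-up names):

    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;

    interface GreetingService {                 // the API bundle exports only this interface
        String greet(String name);
    }

    public class GreetingActivator implements BundleActivator {
        @Override
        public void start(BundleContext context) {
            // the implementation bundle registers the service in the local OSGi service registry
            context.registerService(GreetingService.class, name -> "Hello " + name, null);
        }

        @Override
        public void stop(BundleContext context) {
            // registrations made via this context are cleaned up automatically when the bundle stops
        }
    }

Any other bundle in the same JVM can then look the service up by its interface, without knowing the implementation.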
Apache Karaf is an OSGi server that is comparable to an application server, only it works with OSGi applications. It brings a lot of convenience technology on top of your choice of OSGi framework, either Apache Felix or Eclipse Equinox. Both are OSGi frameworks that provide the basic OSGi infrastructure: SOA within the same JVM.
There are a lot of other benefits too, like starting, stopping and updating services.
As for RPC, this can easily be achieved by combining OSGi services with CXF, for example. It can be configured to export an OSGi service as a CXF service (this is very Karaf/CXF specific).
Apache Karaf itself also supports clustering with Apache Karaf Cellar, which provides a DOSGi service for easier communication between services across cluster groups. DOSGi stands for Distributed OSGi, which can also be achieved using CXF and many other implementations.
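As a hedged sketch of how such a remote export looks with the standard OSGi Remote Services properties (reusing the GreetingService interface from the sketch above; the CXF config type and address values are illustrative assumptions):

    import java.util.Dictionary;
    import java.util.Hashtable;
    import org.osgi.framework.BundleContext;

    public class RemoteExport {
        static void export(BundleContext context, GreetingService impl) {
            Dictionary<String, Object> props = new Hashtable<>();
            // standard Remote Services properties: ask the DOSGi implementation to export this service
            props.put("service.exported.interfaces", "*");
            props.put("service.exported.configs", "org.apache.cxf.ws");       // use CXF's SOAP transport
            props.put("org.apache.cxf.ws.address", "http://0.0.0.0:9090/greeting");
            context.registerService(GreetingService.class, impl, props);
        }
    }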
Apache Felix and OSGi in general are far from being out of date. A lot of Java EE application servers use OSGi as their underlying technology in order to be modular and have a smaller footprint: GlassFish, WebSphere, Geronimo, etc.
ESB experts, I need some help. I am stuck with Apache ServiceMix (v4.5.3). The scenario is communication between our enterprise applications: one is a web application already running on Tomcat, two are the main applications, and the last one is a Vert.x server (used for push notifications). They all run on different machines.
The problem is how to configure ServiceMix so that I can use it as an ESB and let the applications communicate.
What I have done so far:
1. Deployed the web application (WAR) as a bundle in the ServiceMix deploy folder.
Is this the right approach for communication between independent applications? What I am thinking is: don't deploy any WAR/JAR in ServiceMix, just use it as an ESB. I mean, is it necessary to deploy applications in ServiceMix for them to communicate? If yes, how would I achieve this, given that I am using a distributed environment and ServiceMix is running on a separate machine?
Please guide me. I am a novice in the ESB world. Feel free to ask if anything is unclear.
Yes, you need to deploy applications in ServiceMix that contain the integration logic and your business logic.
ServiceMix uses Apache Karaf as the container, so it's basically just an application server (OSGi based).
To build integration applications you should very often use Apache Camel, as it comes out of the box with ServiceMix. So I suggest you take a look at Apache Camel and learn how to use it for integration.
There is another question here on Stack Overflow that can help you get started learning about Apache Camel: What exactly is Apache Camel?
Apache ServiceMix / Karaf supports deploying WAR files as well as OSGi bundles; the latter is more often used. Some WAR files may not work if they use libraries that do not behave well in an OSGi environment.
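As a hedged sketch of what such a deployable integration bundle might contain, here is a Camel RouteBuilder bridging the applications (the queue names, the Vert.x host and port, and the choice of JMS over ActiveMQ are illustrative assumptions):

    import org.apache.camel.builder.RouteBuilder;

    public class BridgeRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // events published by the main applications on a JMS queue...
            from("activemq:queue:app.events")
                // ...are forwarded to the Vert.x server's HTTP endpoint for push notification
                .to("http://vertx-host:8080/notify");

            // requests from the web application are routed on to the second main application
            from("activemq:queue:webapp.requests")
                .to("activemq:queue:app2.requests");
        }
    }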
I want to build an FTP server that has no actual files in the background. Rather, I want files uploaded to it to be immediately processed by my backend. The file listing of the upload directories should contain only those files that are not yet processed. Deletion or moving should not be possible.
Also, on the download side I want to present those files that I'm able to deliver, but the files shall be created on demand, again by the backend.
Since I don't want to reimplement FTP, does anyone know of a Java library that helps with implementing the server side of the FTP protocol and is customizable in the way I need?
I have looked into the always helpful Jakarta Commons, but it seems to focus on the client side.
Thanks, Mike ;-)
Check out http://mina.apache.org/ftpserver/.
The Apache FtpServer is a 100% pure Java FTP server. It's designed to be a complete and portable FTP server engine solution based on currently available open protocols. FtpServer can be run standalone as a Windows service or Unix/Linux daemon, or embedded into a Java application. We also provide support for integration within Spring applications and provide our releases as OSGi bundles.
The default network support is based on Apache MINA, a high performance asynchronous IO library. Using MINA, FtpServer can scale to a large number of concurrent users.
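As a hedged sketch of embedding it, and of where the "no real files" behaviour you describe would plug in (the port, the users.properties path and the VirtualFileSystemFactory name are illustrative assumptions):

    import org.apache.ftpserver.FtpServer;
    import org.apache.ftpserver.FtpServerFactory;
    import org.apache.ftpserver.listener.ListenerFactory;
    import org.apache.ftpserver.usermanager.PropertiesUserManagerFactory;

    public class EmbeddedFtpServer {
        public static void main(String[] args) throws Exception {
            FtpServerFactory serverFactory = new FtpServerFactory();

            // listen on a non-privileged port
            ListenerFactory listenerFactory = new ListenerFactory();
            listenerFactory.setPort(2221);
            serverFactory.addListener("default", listenerFactory.createListener());

            // users come from a properties file in this sketch
            PropertiesUserManagerFactory userManagerFactory = new PropertiesUserManagerFactory();
            userManagerFactory.setFile(new java.io.File("users.properties"));
            serverFactory.setUserManager(userManagerFactory.createUserManager());

            // customisation point: a FileSystemFactory whose FileSystemView lists only
            // unprocessed uploads, streams uploads straight to the backend and creates
            // downloads on demand (implementation not shown)
            // serverFactory.setFileSystem(new VirtualFileSystemFactory());

            FtpServer server = serverFactory.createServer();
            server.start();
        }
    }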
Maybe you can use Apache FtpServer; it is the same project described in the quote above.
The above-mentioned links to the Java FTP server don't work because the project has moved here:
http://mina.apache.org/ftpserver-project/index.html
Here are a couple which might be helpful:
http://drftpd.org/
http://mina.apache.org/ftpserver/
Since you do not actually want the files to be transferred and listed with the usual FTP behavior, you'll need to intercept the code that handles how files are listed and retrieved by clients.