I have a Spring application with two controllers. I want to run one controller on localhost:8080 and the second controller on localhost:8081.
Am I able to configure Tomcat to serve two ports simultaneously, i.e. 8080 and 8081? Is that possible? How?
Please note that it is not a Spring Boot application.
It sounds like two completely different applications.
You certainly could configure your Tomcat's server.xml file to have multiple HTTP connectors running on different ports. But you'll find it much easier and less of a hassle to deal with two different Tomcat instances.
An app server (Tomcat, JBoss, GlassFish) runs on / watches one port. For this reason you can run multiple app servers on a single node (computer) with different port numbers. They can be the same product (Tomcat + Tomcat) or different ones (Tomcat + GlassFish).
But in that case you need to split the controllers into two different applications and deploy them on the separate app server instances.
This is the microservices architectural design style, where you run a separate app server for every service. Microservices usually use REST over HTTP to communicate with each other.
That said, in the case of Tomcat (maybe not with every product) it is possible: Running Tomcat server on two different ports
No. Spring runs on a specific port, and that will be the port for both REST controllers. You can give them different URLs, though.
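For example, a minimal sketch of two controllers sharing one port under different URL prefixes (the class names and paths here are made up for illustration):

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Both controllers are served by the same DispatcherServlet on the same port,
// but under different URL prefixes.
@RestController
@RequestMapping("/first")
class FirstController {
    @GetMapping("/hello")
    public String hello() {
        return "hello from the first controller";
    }
}

@RestController
@RequestMapping("/second")
class SecondController {
    @GetMapping("/hello")
    public String hello() {
        return "hello from the second controller";
    }
}
```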
It's not possible.
Spring MVC, as many other web frameworks, is designed around the front controller pattern where a central Servlet, the DispatcherServlet, provides a shared algorithm for request processing, while actual work is performed by configurable delegate components.
https://docs.spring.io/spring/docs/current/spring-framework-reference/web.html
Spring itself does not run on any port. It is just a technology for creating APIs. The port is bound by the server (like Tomcat, JBoss, etc.). So if you want to use different ports for different controllers, you need to deploy multiple applications across multiple servers and make those servers listen on different ports.
For the application that should run on 8081, add the following line to its application.properties file:
server.port=8081
Then just run both of them.
Otherwise, set the port to 8081 in the Tomcat configuration, and again run both of them.
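If you'd rather set the port in code than in application.properties, here is a minimal sketch assuming Spring Boot 2.x with embedded Tomcat (the config class name is made up):

```java
import org.springframework.boot.web.server.WebServerFactoryCustomizer;
import org.springframework.boot.web.servlet.server.ConfigurableServletWebServerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ServerPortConfig {

    // Overrides the embedded server's port programmatically,
    // equivalent to server.port=8081 in application.properties.
    @Bean
    public WebServerFactoryCustomizer<ConfigurableServletWebServerFactory> portCustomizer() {
        return factory -> factory.setPort(8081);
    }
}
```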
You can find a good example at the link below. It uses a different port for different resources by binding extra ports on the embedded Tomcat in Spring Boot. Hope this helps.
https://tech.asimio.net/2016/12/15/Configuring-Tomcat-to-Listen-on-Multiple-ports-using-Spring-Boot.html
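The gist of the linked article, as a rough sketch assuming Spring Boot 2.x with embedded Tomcat (the extra port 8081 and the class name are just examples):

```java
import org.apache.catalina.connector.Connector;
import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory;
import org.springframework.boot.web.servlet.server.ServletWebServerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MultiPortTomcatConfig {

    // Embedded Tomcat listens on the default server.port (e.g. 8080)
    // plus one additional connector on 8081.
    @Bean
    public ServletWebServerFactory servletContainer() {
        TomcatServletWebServerFactory factory = new TomcatServletWebServerFactory();
        factory.addAdditionalTomcatConnectors(extraConnector());
        return factory;
    }

    private Connector extraConnector() {
        Connector connector = new Connector("org.apache.coyote.http11.Http11NioProtocol");
        connector.setPort(8081);
        return connector;
    }
}
```

Note that without further routing every controller is reachable on both ports; the RequestCondition approach mentioned in a later answer (or a servlet filter that checks the local port) is one way to restrict that.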
Yes, you can, but they will behave like two separate applications and be independent of each other. However, they can share common resources like databases, password directories, etc.
For a use case such as this, however, I would recommend looking into microservices.
Read more about microservices here
One approach is to create an additional org.apache.catalina.connector.Connector and route requests from it with an org.springframework.web.servlet.mvc.condition.RequestCondition: https://stackoverflow.com/a/69397870/6166627
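As a rough illustration of the RequestCondition half (the class name is hypothetical, and the full wiring through a custom RequestMappingHandlerMapping is in the linked answer), such a condition might only match requests that arrived on given ports:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import javax.servlet.http.HttpServletRequest;
import org.springframework.web.servlet.mvc.condition.RequestCondition;

// Hypothetical condition: a mapping guarded by it only matches
// requests that arrived on one of the configured local ports.
public final class PortRequestCondition implements RequestCondition<PortRequestCondition> {

    private final Set<Integer> ports;

    public PortRequestCondition(Integer... ports) {
        this.ports = new HashSet<>(Arrays.asList(ports));
    }

    @Override
    public PortRequestCondition combine(PortRequestCondition other) {
        // Method-level conditions override type-level ones.
        return other.ports.isEmpty() ? this : other;
    }

    @Override
    public PortRequestCondition getMatchingCondition(HttpServletRequest request) {
        return ports.isEmpty() || ports.contains(request.getLocalPort()) ? this : null;
    }

    @Override
    public int compareTo(PortRequestCondition other, HttpServletRequest request) {
        // A condition with fewer ports is more specific and wins.
        return Integer.compare(this.ports.size(), other.ports.size());
    }
}
```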
I have two different APIs. They each have their own .war file and are both running on the same tomcat instance.
Strangely, I am able to reach one API with requests like this: https://(ip-address):443/(path1)
but the other responds only to this: http://(ip-address):8090/(path2)
Also complicating things is that when I deploy the second war to a certain other Tomcat instance on another server, it will respond to HTTPS requests on 443.
Any idea how this is possible?
This is strange, because at different times either the war or Tomcat works as intended (by using HTTPS), so it is unclear whether to blame the war or Tomcat.
Applications can declare that they need confidential connections (HTTPS). Look inside WEB-INF/web.xml.
So one of the applications might answer on both because no constraint is defined, while the other may respond only over HTTPS because the container is responsible for ensuring secured communication. I'd be more surprised to hear that one of the applications responds to HTTP only.
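For illustration, the annotation-based equivalent (Servlet 3.0+) of a web.xml security constraint with a CONFIDENTIAL transport guarantee looks roughly like this; the servlet name and path are made up:

```java
import java.io.IOException;
import javax.servlet.annotation.HttpConstraint;
import javax.servlet.annotation.ServletSecurity;
import javax.servlet.annotation.ServletSecurity.TransportGuarantee;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// CONFIDENTIAL tells the container this servlet must only be reached over a
// secured connection; Tomcat then redirects plain HTTP requests to the
// connector's configured redirect port.
@WebServlet("/secure-only")
@ServletSecurity(@HttpConstraint(transportGuarantee = TransportGuarantee.CONFIDENTIAL))
public class ConfidentialServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.getWriter().println("reached over HTTPS");
    }
}
```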
From https://tomcat.apache.org/tomcat-9.0-doc/config/http.html#Introduction:
One or more such Connectors can be configured as part of a single Service, each forwarding to the associated Engine to perform request processing and create the response.
Check in your server.xml whether you have several Services with HTTP and HTTPS connectors mapping to different Engines, and whether the applications are deployed across these different Engines. That could explain one application responding to HTTP only while the other responds to HTTPS only.
I have an application with Spring Boot, Apache CXF, and Java.
It seems that the application should be launched as two processes, so I thought I had to open two server ports: one is 8090, the other is 8080.
Also there is one service using one WSDL.
How can the client distinguish between this server's different listening ports when calling the service?
It's better to think about scaling, i.e. making the system able to handle more incoming requests.
There are several scaling solutions besides serving two different ports, depending on your infrastructure and your needs. Scaling can be done vertically or horizontally.
You can check this answer, which describes Spring Boot scaling in detail.
My question is simple: is it perfectly safe to have 2 independent Jetty Server instances in one JVM process, listening on different ports with independent URL mappings and SSL/TLS setup? I'm not seeing odd behaviour but before deploying to live, I'd like to get some assurance that what I'm doing is sound. If not, would it be proper to have the same set-up using a single Server instance with somehow separate URL namespaces and security SSL/TLS setup?
Yes, absolutely. We do this in many unit tests throughout Jetty.
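For reference, a minimal sketch of that setup using the Jetty 9.x API (the ports and handlers are placeholders; SSL/TLS would be added per server via a ServerConnector with an SslContextFactory):

```java
import java.io.IOException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.AbstractHandler;

public class TwoServers {
    public static void main(String[] args) throws Exception {
        // First server instance on port 8080.
        Server first = new Server(8080);
        first.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest,
                               HttpServletRequest request, HttpServletResponse response) throws IOException {
                response.getWriter().println("first server");
                baseRequest.setHandled(true);
            }
        });

        // Second, completely independent server instance on port 8081.
        Server second = new Server(8081);
        second.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest,
                               HttpServletRequest request, HttpServletResponse response) throws IOException {
                response.getWriter().println("second server");
                baseRequest.setHandled(true);
            }
        });

        first.start();
        second.start();
        first.join();
    }
}
```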
I have one web server with two instances of Tomcat running. On each Tomcat instance I have multiple web apps or web services.
What is the best way to call a function (or trigger some event with parameters) in a webapp running on the second Tomcat instance from a webapp on the first? If it is, for example, a call using a URL with parameters, then this call should be secure and not accessible from outside the server.
I've read something about getting the servlet context, but is this possible across different Tomcat instances? I'm thinking this is only possible for webapps running in the same instance.
I don't want to use CORBA, RMI or SOAP because they are a bit oversized for my problem... that is what I'm thinking :)
Code examples are welcome. Thank you!
The ServletContext is only valid within the same container and can't be shared between two JVMs. The simplest method to do what you're asking is to just use some variety of RPC between the two containers, and RMI doesn't seem like particular overkill. The other usual approach would be a simple HTTP Web service (note the lowercase "s") that invokes your logic in the receiving container.
Spring's HTTPInvoker is great for this. You can use a Java interface, and your code on each instance doesn't need to know the call is remote - it just calls Java methods.
For security, you can use the Sun HTTP server on a different port (instead of using a servlet within Tomcat) and listen only on localhost.
Have a look here
http://static.springsource.org/spring/docs/3.2.x/spring-framework-reference/html/remoting.html#remoting-httpinvoker
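A rough sketch of the HttpInvoker wiring (Spring 3.x/4.x-era API; the interface, bean names, and URL below are made up):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.remoting.httpinvoker.HttpInvokerProxyFactoryBean;
import org.springframework.remoting.httpinvoker.HttpInvokerServiceExporter;

// Shared interface, packaged as a jar both webapps depend on.
interface EventService {
    void trigger(String eventName, String payload);
}

// Trivial implementation living in the receiving webapp.
class EventServiceImpl implements EventService {
    @Override
    public void trigger(String eventName, String payload) {
        System.out.println("received " + eventName + ": " + payload);
    }
}

// --- Receiving webapp (second Tomcat instance): expose the implementation over HTTP.
@Configuration
class ServerConfig {
    // The bean name doubles as the URL when picked up by the
    // DispatcherServlet's BeanNameUrlHandlerMapping.
    @Bean(name = "/eventService")
    public HttpInvokerServiceExporter eventServiceExporter() {
        HttpInvokerServiceExporter exporter = new HttpInvokerServiceExporter();
        exporter.setService(new EventServiceImpl());
        exporter.setServiceInterface(EventService.class);
        return exporter;
    }
}

// --- Calling webapp (first Tomcat instance): a proxy that looks like a local bean.
@Configuration
class ClientConfig {
    @Bean
    public HttpInvokerProxyFactoryBean eventService() {
        HttpInvokerProxyFactoryBean proxy = new HttpInvokerProxyFactoryBean();
        proxy.setServiceUrl("http://localhost:8081/second-app/eventService"); // hypothetical URL
        proxy.setServiceInterface(EventService.class);
        return proxy;
    }
}
```

Binding the receiving Tomcat's HTTP connector to 127.0.0.1 (the address attribute on the Connector) keeps the exported endpoint unreachable from outside the server.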
Use simple REST services, though they are not that secure.
I have a problem. I need to host many (tens, hundreds) of small identical Java web applications that have different loads at any given time. I want to use GlassFish v3. Do I need to use a load balancer and clusters, or something else? Please advise where I can find information about similar problems and their solutions.
I need to host many (tens, hundreds) of small identical Java web applications that have different loads at any given time.
For hundreds of webapps, you will very likely need more than one app server instance. But this sounds odd to be honest.
I want to use GlassFish v3. Do I need to use a load balancer and clusters, or something else?
Right now, GlassFish v3 offers only basic clustering support using mod_jk (i.e. no load balancer plugin, no centralized admin, no high availability). If you are interested, have a look at this note that describes the configuration steps for GFv3 and mod_jk.
For centralized admin and clustering, you'll have to wait for GlassFish 3.1 (see the GlassFish Roadmap Community Update slides).
You could check out Gigaspaces. I have seen it used in conjunction with Mule for a somewhat similar project. ESBs tend to be overkill in my opinion, but it sounds like you have quite the task to conquer.
Based on your requirements, you cannot do load balancing since the load is predetermined by which client the request is for. Each request has to go to the app handling that client, so it cannot be distributed outside the set of apps dedicated to that client.
You can use multi-threading: you could set up the configuration so that different threads handle different clients. However, it might be better to simply have a server that can handle requests from different clients. Based on the client sent with the request, it would be dispatched to a different database, etc.
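A hedged sketch of that dispatching idea (all names here are hypothetical): a single application keyed on a client identifier, resolving the matching DataSource per request.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.sql.DataSource;

// Hypothetical registry: one DataSource per client, all served by a single application.
public class ClientDataSourceRouter {

    private final Map<String, DataSource> dataSourcesByClient = new ConcurrentHashMap<>();

    public void register(String clientId, DataSource dataSource) {
        dataSourcesByClient.put(clientId, dataSource);
    }

    // Called per request, e.g. with a client id taken from a header or the URL.
    public DataSource resolve(String clientId) {
        DataSource ds = dataSourcesByClient.get(clientId);
        if (ds == null) {
            throw new IllegalArgumentException("Unknown client: " + clientId);
        }
        return ds;
    }
}
```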