How to get all services implementing a given interface, not just the active ones? - java

I have a business requirement that, as part of some processing, we enable "plugin" functionality for externally-managed code.
It was decided that the best approach would be to @Reference a list of "ranked" services (ordered according to an algorithm that is irrelevant here). More or less like this:
public interface ExternalProcessor {
    void doSomething();
}

@Component
@Service(ImportantTaskManager.class)
public class ImportantTaskManager {

    @Reference(cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE, referenceInterface = ExternalProcessor.class)
    protected List<ExternalProcessor> processors;

    public void doImportantStuff() {
        for (ExternalProcessor processor : processors) {
            processor.doSomething();
        }
    }
}
To keep it short, I've omitted as much boilerplate as I could, including the bind/unbind method pair.
Now, another business requirement is that we don't perform any processing whatsoever if there are services implementing the ExternalProcessor interface that are not bound to our main processor (for any reason: unresolved dependencies, a crash during activation, missing required configuration, etc.). I have a feeling this goes somewhat against OSGi principles (OSGi provides only the available services, as opposed to information about the unavailable ones), but how can I achieve it?
So far I've come up with the following candidates for solutions:
Ask the external team to provide a count of the services we're supposed to expect, then compare that against what we get from OSGi - unreliable.
Crawl through all the installed bundles and the metadata XMLs taken from the bundles' headers, looking for service definitions - that's... well, time-consuming, for a start.
grep through the logs looking for service registrations and/or failures - this one just seems... wrong.
Is any of the above a proper solution? Are there better ones? How else can I tackle this problem? What am I missing here?

I had a similar requirement for security plugins: the code calling the plugins must not run when the necessary security plugins are missing.
I solved it by defining a service property, e.g. id. Each plugin carries a unique id, and in the config of your main code you specify the list of plugin ids that you require.
The code then checks the bound services for these ids and only activates the main component when all mandatory plugins are present.
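A minimal sketch of that check, reusing the Felix SCR annotations from the question; the property names ("plugin.id", "mandatory.ids") and the configuration handling are illustrative assumptions, not part of the original answer:

import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.felix.scr.annotations.Activate;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Reference;
import org.apache.felix.scr.annotations.ReferenceCardinality;
import org.apache.felix.scr.annotations.ReferencePolicy;
import org.apache.felix.scr.annotations.Service;

@Component
@Service(ImportantTaskManager.class)
public class ImportantTaskManager {

    // plugins currently bound, keyed by their "plugin.id" service property
    private final Map<String, ExternalProcessor> processorsById = new ConcurrentHashMap<>();

    // ids that must all be bound before any processing happens
    private volatile Set<String> mandatoryIds = Collections.emptySet();

    @Activate
    protected void activate(Map<String, Object> config) {
        // "mandatory.ids" is a hypothetical multi-value configuration property
        String[] ids = (String[]) config.getOrDefault("mandatory.ids", new String[0]);
        mandatoryIds = new HashSet<>(Arrays.asList(ids));
    }

    @Reference(cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE,
               referenceInterface = ExternalProcessor.class,
               policy = ReferencePolicy.DYNAMIC)
    protected void bindProcessor(ExternalProcessor processor, Map<String, Object> props) {
        processorsById.put((String) props.get("plugin.id"), processor);
    }

    protected void unbindProcessor(ExternalProcessor processor, Map<String, Object> props) {
        processorsById.remove((String) props.get("plugin.id"));
    }

    public void doImportantStuff() {
        // refuse to do any processing while a mandatory plugin is missing
        if (!processorsById.keySet().containsAll(mandatoryIds)) {
            return;
        }
        for (ExternalProcessor processor : processorsById.values()) {
            processor.doSomething();
        }
    }
}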


Where is the Spring Actuator controller endpoint, and can I call it programmatically with a JVM call?

I want to find the actual Java class that serves the Spring Actuator endpoint (/actuator).
It's similar to this question in a way, but that person wanted to call it via a network HTTP call. Ideally, I want to call it within the JVM to save the cost of setting up an HTTP connection.
The reason is that we have two metrics frameworks in our system: a legacy framework built on OpenCensus, and Spring Actuator (Prometheus metrics based on Micrometer), which we migrated to. I think the Spring one is better, but I didn't realize how much infrastructure my company had built around the old one. For example, we leverage internal libraries that use OpenCensus, and the infra team depends on OpenCensus-based metrics from our app. So the idea is to try to merge and report both sets of metrics.
I want to create my own metrics endpoint that pulls in data from OpenCensus's endpoint and Actuator's endpoint. I could make an HTTP call to each, but I'd rather call them within the JVM to save resources and reduce latency.
Or perhaps I'm thinking about it wrong. Should I simply be using MeterRegistry.forEachMeter() in my endpoint?
In any case, I thought that if I found the Spring Actuator endpoint, I could see an example of how they're doing it and mimic the implementation, even if I don't call it directly.
Bonus: I'll need to track down the OpenCensus handler that serves its endpoint too, and will probably make another post for that - but if you know the answer to that as well, please share!
I figured it out, and I'm posting this for anyone else who's interested.
The key finding: the MeterRegistry that is @Autowired is actually a PrometheusMeterRegistry if you enable Prometheus metrics.
Once you cast it to a PrometheusMeterRegistry, you can call its .scrape() method to get the exact same metrics printout you would get by hitting the HTTP endpoint.
I also need to get the same info from OpenCensus and I found a way to do that too.
Here's the snippet of code for getting metrics from both frameworks:

// OpenCensus side: serialize the default Prometheus collector registry
// (ImmutableSet is Guava; an empty filter set means "all metrics")
Enumeration<MetricFamilySamples> openCensusSamples =
        CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(ImmutableSet.of());
StringWriter writer = new StringWriter();
TextFormat.write004(writer, openCensusSamples);
String openCensusMetrics = writer.toString();

// Micrometer side: the autowired MeterRegistry is a PrometheusMeterRegistry here
PrometheusMeterRegistry registry = (PrometheusMeterRegistry) meterRegistry;
String micrometerMetrics = registry.scrape();

return openCensusMetrics.concat(micrometerMetrics);
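If you want to serve this from its own actuator endpoint, a sketch along the following lines should work; the endpoint id "combinedmetrics" and the class name are made up for illustration. (The endpoint still has to be exposed via management.endpoints.web.exposure.include.)

import java.io.IOException;
import java.io.StringWriter;
import java.util.Collections;
import java.util.Enumeration;

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.prometheus.PrometheusMeterRegistry;
import io.prometheus.client.Collector.MetricFamilySamples;
import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.exporter.common.TextFormat;
import org.springframework.boot.actuate.endpoint.annotation.Endpoint;
import org.springframework.boot.actuate.endpoint.annotation.ReadOperation;
import org.springframework.stereotype.Component;

@Component
@Endpoint(id = "combinedmetrics") // served at /actuator/combinedmetrics
public class CombinedMetricsEndpoint {

    private final PrometheusMeterRegistry registry;

    public CombinedMetricsEndpoint(MeterRegistry meterRegistry) {
        // valid when Prometheus support is enabled, as described above
        this.registry = (PrometheusMeterRegistry) meterRegistry;
    }

    @ReadOperation(produces = "text/plain")
    public String metrics() throws IOException {
        // OpenCensus metrics from the shared default collector registry
        Enumeration<MetricFamilySamples> samples =
                CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(Collections.emptySet());
        StringWriter writer = new StringWriter();
        TextFormat.write004(writer, samples);

        // Micrometer metrics in the same Prometheus text format
        return writer.toString() + registry.scrape();
    }
}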
I found another interesting way of doing this.
It's the other answer I gave, but that one has an issue: it contains duplicate results. When I looked into it, I realized that both OpenCensus and Micrometer were reporting the same metrics.
It turns out that the PrometheusScrapeEndpoint implementation uses the same CollectorRegistry that OpenCensus does, so both sets of metrics were being added to the same registry.
You just need to make sure to provide these beans:

@PostConstruct
public void openCensusStats() {
    PrometheusStatsCollector.createAndRegister();
}

@Bean
public CollectorRegistry collectorRegistry() {
    return CollectorRegistry.defaultRegistry;
}

Looking for the best way to share an interface between microservices with quarkus

I'm still quite new to microservices and have a few basic architectural questions that I haven't been able to resolve.
I'm using the Quarkus framework with the standard extensions like quarkus-resteasy and quarkus-rest-client for the realization.
The scenario:
I have an example of a "Persistence" service, in a dedicated Maven project, that I want to populate with data externally via a REST call.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
public class Persistence{
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto postDto) {
Post post = toPostMapper.toResource(postDto);
entityManager.persist(post);
return Response.ok(postDto).status(201).build();
}
}
At the same time, I would like to have a DataGenerator microservice which generates the corresponding data and passes it to the persistence service.
My problem: API sharing.
Both services were created as Maven projects.
According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project, like this:
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public interface PersistenceApi {
#POST
#Transactional
public Response create(PostDto post) ;
}
This interface is then injected into the DataGenerator service via @Inject, which leads to the following exemplary service:
@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}
I have the PersistenceService running locally on port 8181 and have added the following entries to the application.properties of the DataGenerator project so that the service can be found:
furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton
I find it "wrong" to declare the interface in my DataGenerator, because at this point I don't notice when the api provided by the Persistence service changes. Accordingly one could come up with the idea to position the interface in the Persistence service, which is then implemented by my concrete Persistence implementation and leads to the following code.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto fruit) {
Post post = toPostMapper.toResource(fruit);
entityManager.persist(post);
return Response.ok(fruit).status(201).build();
}
}
In order to use them in my DataGenerator project, I would have to include the Persistence project as a dependency in my DataGenerator project, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of "separation of concerns".
I have tried the following approach:
I created another Maven project called PersistenceApi which contains only the PersistenceApi interface. This PersistenceApi project was then included as a dependency in both the Persistence and DataGenerator projects. In the Persistence project I implement the service from the example above, and in the DataGenerator project I try to address the corresponding interface via @Inject.
Unfortunately this does not work: when building the service, I get an UnsatisfiedResolutionException saying that the PersistenceApi dependency, which I want to inject into the DataGenerator service via @Inject, cannot be injected.
Now my questions:
I don't see what I'm missing here. Could you help me?
Is this kind of API-sharing with dedicated Api projects a viable way or is the "monolith with extra steps" approach really the way to go?
Thank you in advance.
That's a common problem with microservices. Following the book "Microservices: Grundlagen flexibler Softwarearchitekturen" by Eberhard Wolff (I saw that you are German too), I hold the view that microservices should have the same coupling as the teams developing them and as the organization you are developing them for (have a look at Conway's law). Therefore, services of mostly independent teams should be developed independently, and API changes in one service should not affect another service at the time of the update.
If you develop both services within your own team, then I think you can couple them the way you are doing, because you don't have to coordinate with other teams and there will be no huge overhead. Note that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:
I use API versioning, so the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
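As a rough sketch of what that looks like with JAX-RS, reusing the PersistenceApi from the question (PostDtoV2 is a hypothetical evolved payload):

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

// The old contract stays reachable for clients that haven't migrated yet.
@Path("/v1/api/persistence")
public interface PersistenceApiV1 {
    @POST
    Response create(PostDto post);
}

// The new contract can evolve independently; consumers switch when ready.
@Path("/v2/api/persistence")
public interface PersistenceApiV2 {
    @POST
    Response create(PostDtoV2 post); // hypothetical new DTO version
}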
Have a look at Domain-Driven Design for the different ways of integrating bounded contexts (= services) and the coupling consequences of each. Without API versioning you are forced into a partnership and you need to release together. Maybe you would prefer Customer-Supplier, or even Conformist.
To test compatibility between both services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help to notify people about changes.

How to develop a web application with a bunch of feature modules

I am developing a web application using the Spring framework and Google App Engine. I am wondering if there is a design pattern or framework with whose help I can develop the features of my application as pluggable modules. For example, I have identified four features of the application:
OAuth Login
User Profile Management
User Group creation
User File management
Now what I need is to develop all these features as independent modules, so that I can detach any of them dynamically and they are as loosely coupled as possible. They can each have their own database implementation, their own set of technologies, and so on. Is there a design principle for implementing modules in such a way?
You can take a look at MQ systems (such as RabbitMQ or ActiveMQ).
An MQ system works as an intermediate layer which gives you loose coupling.
Communication between modules is implemented by posting messages to a queue and listening for them, as in the sketch below.
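For example, with RabbitMQ's Java client, the producing and consuming modules only share a queue name, never code; the queue name and the payload here are made up for illustration:

import java.nio.charset.StandardCharsets;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.DeliverCallback;

public class ModuleMessagingSketch {

    private static final String QUEUE = "user.profile.events"; // illustrative name

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            channel.queueDeclare(QUEUE, true, false, false, null);

            // Producer side: one module publishes an event...
            channel.basicPublish("", QUEUE, null,
                    "profile-updated:42".getBytes(StandardCharsets.UTF_8));

            // ...consumer side: another module reacts, with no compile-time coupling.
            DeliverCallback onMessage = (consumerTag, delivery) ->
                    System.out.println("received: "
                            + new String(delivery.getBody(), StandardCharsets.UTF_8));
            channel.basicConsume(QUEUE, true, onMessage, consumerTag -> { });
        }
    }
}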
Also, OSGi may help you. It gives you the possibility to build your application as a set of pluggable modules which can be loaded dynamically.
As per my experience, I suggest using the MVC pattern. Use servlet filters for:
1. OAuth login
Create services/POJOs that implement the features and inject them into each other according to your requirements, for:
2. User profile management
3. User group creation
4. User file management
If you know Spring AOP, use it, so that you can achieve more dynamic integration between the implementations of points 2, 3 and 4 (see the sketch below).
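A small sketch of that AOP idea: a made-up audit aspect that cuts across the profile, group and file services without them knowing about it (package and class names are hypothetical):

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class AuditAspect {

    // Runs after any public method of the feature services completes,
    // without the services themselves referencing auditing at all.
    @AfterReturning("execution(public * com.example.features..*Service.*(..))")
    public void audit(JoinPoint joinPoint) {
        System.out.println("executed: " + joinPoint.getSignature().toShortString());
    }
}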
You should split each feature into two components: an API and an implementation. The first contains interfaces, the second their implementations. You pass the interface to the web-app controller and inject the implementation via Spring or any other dependency-injection framework. For example:
web-app - a UserController which handles requests from the client and delegates to your components:
@RestController
public class UserController {

    private final FileManager fileManager;

    @Autowired
    public UserController(FileManager fileManager) {
        this.fileManager = fileManager;
    }

    @GetMapping("/user/{userId}/file/{fileId}")
    public File getUserFile(@PathVariable long userId, @PathVariable long fileId) {
        return fileManager.getUserFile(userId, fileId);
    }
}
file-mgt-api - where you define the interfaces that decouple the web-app from the implementation:
public interface FileManager {
    File getUserFile(long userId, long fileId);
}
file-mgt-impl - where all the details of how to get the requested file live:
@Component
public class FileManagerImpl implements FileManager {

    @Override
    public File getUserFile(long userId, long fileId) {
        // get the file by id from the DB
        // verify that the provided user is the file owner
        // do other useful stuff
        // return the file, or throw an exception if something is wrong
        throw new UnsupportedOperationException("placeholder for the real lookup");
    }
}
Do the same for the group, profile-management and other features. After that, you can easily replace an implementation by replacing a single jar file. Your web-app is completely decoupled and doesn't know anything about implementation details; it depends only on the interfaces.

Spring: How to export objects methods via non-HTTP protocol

I have a collection of stateless Scala/Java APIs, which might look like:

class UserService {
  def get(id: UserIDType): User = { ... }
  def update(user: User): User = { ... }
  ...
}
where User has a set of inspectable bean properties. I'd like to make these same APIs available not only over HTTP (which I know how to do) but also over other, more performant non-HTTP protocols (ideally also running in the same process as the HTTP server). More importantly, I want to automate as much as possible, including the generation of client APIs that match the original Java API, and the dispatching of method calls following network requests.
I've found Spring's guide on remoting.
However, this looks limited to HTTP (text, not binary). What I'd really love is a library or other method that:
lets me scan for registered/annotated services and describe their methods and parameters as data
lets me easily dispatch method calls (so that I'm in control of the sockets and communication protocols, and can choose one more performant than HTTP)
i.e. the kinds of things that Spring's DispatcherServlet does internally, but without the HTTP limitations.
Ideas?
Background
I'd like to make a set of stateless service API calls available as network services with the following goals:
Some of the APIs should be available as REST calls from web pages/mobile clients
A superset of these APIs should be available internally within our company (e.g. from C++ or Python libraries) in a way that is as high-performance (read: low-latency) as possible, particularly when the service and client are:
co-located in the same process, or
co-located on the same machine.
automate the wrapper code for the client and server. If I add a method to a service API, or an attribute to a class, no one should have to write additional code in the client or server (e.g. I hit a button and a new client API that matches the original Java/Scala service is auto-generated).
(1) is easily achievable with Spring MVC. Without duplicating classes, I can simply mark up the service (i.e. the service is also a controller):
@Controller
@Service
@RequestMapping(...)
class UserService {

  @RequestMapping(...)
  def get(@PathVariable("id") id: UserIDType): User = { ... }

  @RequestMapping(...)
  def update(@RequestBody user: User): User = { ... }

  ...
}
With tools like Swagger, (3) is easily achievable: I can trivially read the Swagger-generated JSON spec and auto-generate C++ or Python code that mimics the original service class, its methods, parameter names and the POJO parameter types.
However, HTTP is not particularly performant, so I'd like to use a different (binary) protocol here. Trivially, I can use this same spec to generate a Thrift .idl, or even directly generate client code that talks Thrift/Protocol Buffers (or any other binary protocol) while also providing the client an API that looks just like the original Java/Scala one.
Really, the only tricky part is getting a description of the service methods and then actually dispatching method calls. So: are there libraries to help me do that?
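To make the scan-and-dispatch part concrete, here is roughly what I have in mind, sketched in Java against Spring's ApplicationContext and plain reflection; the class and method names are mine, and parameter marshalling is left out entirely:

import java.lang.reflect.Method;
import java.util.Map;

import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Service;

public class ReflectiveDispatcher {

    private final ApplicationContext context;

    public ReflectiveDispatcher(ApplicationContext context) {
        this.context = context;
    }

    // Describe every @Service bean's methods as data (names only, for brevity).
    public void describeServices() {
        Map<String, Object> services = context.getBeansWithAnnotation(Service.class);
        services.forEach((name, bean) -> {
            for (Method m : bean.getClass().getMethods()) {
                System.out.println(name + "." + m.getName());
            }
        });
    }

    // Dispatch a request that arrived over any transport as (bean, method, args).
    public Object dispatch(String beanName, String methodName, Object... args) throws Exception {
        Object bean = context.getBean(beanName);
        for (Method m : bean.getClass().getMethods()) {
            if (m.getName().equals(methodName) && m.getParameterCount() == args.length) {
                return m.invoke(bean, args);
            }
        }
        throw new NoSuchMethodException(beanName + "." + methodName);
    }
}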
Notes:
I'm aware that mixing service and controller annotations disgusts some Spring MVC purists. However, my goal really is to have one logical API and to automate all of its service embodiments (i.e. the controller part is redundant boilerplate that should be auto-generated). I'd REALLY rather not spend my life writing boilerplate code.

java ee | ejb3 | runtime dispatch

I would like to call an EJB3 bean at runtime. The name of the bean and the method name will only be available at runtime, so I cannot include any remote interfaces at compile time.
String bean = "some/Bean";
String meth = "doStuff";

// look up the bean
Object remoteInterface = new InitialContext().lookup(bean);

// search for the method...
// for each (methods)
//     if (method == meth) method.invoke(bean);
The beans should be distributed across multiple application servers, and all beans are to be called remotely.
Any hints? Specifically, I do not want:
dependency injection
inclusion of application-specific EJB interfaces in the dispatcher (above)
web services - that's like throwing away processing power for nothing, with all the XML overhead
Is it possible to load an EJB3 remote interface over the network (if yes, how?), so that I could cache the interface in a hashmap or something similar?
I have a solution with a remote dispatcher bean which I can include in the main dispatcher above; it does essentially the same thing, but just relays the call to a local EJB (which I can look up how? A naming lookup fails). Given the remote dispatcher bean, I can use dependency injection.
Thanks for any help.
(NetBeans and GlassFish, btw.)
EJB3 calls use RMI, and RMI supports remote class loading, so I'd suggest looking into that.
Also, JMX MBeans support fully untyped remote invocations. So, if you can use MBeans instead of session beans, that could work. (JBoss, for instance, supports EJB3-like MBeans with some custom annotations.)
Lastly, many app servers support CORBA invocations, and CORBA supports untyped method invocations.
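To illustrate the untyped JMX call: the service URL and MBean name below are placeholders, but the invoke() signature is the standard javax.management API.

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class UntypedJmxCall {

    public static Object call() throws Exception {
        // placeholder URL and MBean name
        JMXServiceURL url =
                new JMXServiceURL("service:jmx:rmi:///jndi/rmi://host:9999/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection server = connector.getMBeanServerConnection();
            ObjectName name = new ObjectName("com.example:type=Dispatcher");
            // method name, arguments and parameter-type signature travel as plain
            // data, so no compile-time interface is required
            return server.invoke(name, "doStuff", new Object[0], new String[0]);
        }
    }
}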
You might be able to use java.rmi.server.RMIClassLoader for the remote class loading. You'll also need to load any classes that the remote service returns or throws.
It cannot be done; you will always get a "class not found" exception. That is, in my opinion, the biggest disadvantage of EJB/Java: you lose some of the dynamics, because the meta-programming features are limited.
EJB3 supports RMI/IIOP, but not RMI/JRMP (standard RMI), so dynamic class loading is not supported. You lose some Java generality, but gain other features, such as being able to communicate about transactions and security.
You need to use reflection for this, and it is very simple to do. Assuming that you're looking for a no-argument void method called meth:
Object ejb = ctx.lookup(bean);
for (Method m : ejb.getClass().getMethods()) {
    if (m.getName().equals(meth) && m.getParameterTypes().length == 0) {
        m.invoke(ejb);
    }
}
If you're looking for a specific method signature just modify the m.getParameterTypes() test accordingly, e.g. for a method with a single String parameter you can try:
Arrays.equals(m.getParameterTypes(), new Class[]{String.class})
And then pass an Object[] array with the actual arguments to the m.invoke() call:
m.invoke(ejb, new Object[]{"arg0"})
