I just started learning how to build a bnd OSGi project.
I tried to run a very simple project. There is no error message, but when I go to localhost it shows "HTTP ERROR: 404".
The simple class:
An Activator class:
REST build dependencies:
Run dependencies:
HTTP error:
Thanks for your help!
The latest 2.0.4 release of the org.amdatu.web.rest.wink bundle doesn't play well with Felix Http Jetty 3.x.
If you pin that bundle to version 2.0.3, things should work as expected. To do this, change the org.amdatu.web.rest.wink entry in the -runbundles list of your runbnd.bndrun to:
org.amdatu.web.rest.wink;version='[2.0.3,2.0.3]'
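For context, a minimal sketch of how that pinned entry might sit in the runbnd.bndrun -runbundles list (the surrounding bundle names are illustrative, not taken from your run file):

```
-runbundles: \
	org.apache.felix.http.jetty,\
	org.amdatu.web.rest.jaxrs,\
	org.amdatu.web.rest.wink;version='[2.0.3,2.0.3]'
```

The range '[2.0.3,2.0.3]' pins the resolver to exactly that version, so a newer 2.0.4 in your repositories will not be picked up.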
Your class is annotated with JAX-RS annotations and publishes an OSGi service. Whether this exposes the service as a REST resource depends on the bundles you install.
You have to install a bundle that watches for such services and creates the REST endpoints for them.
I think you at least also need to add the org.amdatu.web.wink bundle to your bndrun file.
Related
I'm writing a custom Keycloak provider which needs the quarkus-cxf extension as a dependency to integrate with a SOAP service. Everything works well in development mode (I use Maven, so I only needed to add quarkus-cxf as a dependency and run Keycloak on Quarkus from the IDE using IDELauncher).
The problem starts when I try to deploy my custom provider to the production environment. I know that my provider's jar needs to be placed in the providers directory of the Keycloak distribution. Somewhere on the Keycloak website I saw that every third-party dependency jar I use needs to be placed in the providers directory as well, so I did that: using the maven-dependency-plugin I collected all quarkus-cxf dependencies and put them into the providers directory. Then I tried to build Keycloak by running ./kc.bat build, and it failed with the message: ERROR: io.smallrye.config.SmallRyeConfigFactory: io.quarkus.runtime.configuration.QuarkusConfigFactory not a subtype.
How should this be done to work properly? I need the CXF dependencies in the build phase because I generate code from a WSDL file, and that is necessary for Quarkus augmentation. Maybe not all dependencies from the quarkus-cxf hierarchy are needed here, but I cannot imagine adding jars one by one and checking each time whether the build succeeds or fails with a NoClassDefFoundError :) I had the same problem with a previous extension where I used RESTEasy dependencies, but there only a few jars were missing, so I put them into the providers directory and it worked. That is not a workable approach for every extension with heavy dependencies like CXF (with its huge tree of transitive dependencies).
Can anyone help me with this? Maybe some of you have had the same problem and know how it should be done.
I am using the Google App Engine Gradle plugin with an app.yaml file, but that variant of the plugin has no appengineRun or appengineStart task like the appengine-web.xml variant does.
TL;DR appengineRun is only available for appengine-web.xml based projects. If you want to use app.yaml, you must provide your own server, for example Spring Boot with Jetty or Tomcat.
To run your application locally, you must provide your own server.
This guide shows how to test your application using app.yaml together with the app-gradle-plugin, in the section "Testing your application with the development server":
During the development phase, you can run and test your application at any time in the development server by invoking Gradle:
gradle jettyRun
Alternatively, you can run Gradle without installing it by using the Gradle wrapper.
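If jettyRun is not available in your build, note that this task typically comes from a Jetty/Gretty Gradle plugin rather than from app-gradle-plugin itself; a minimal sketch of a build.gradle, assuming the Gretty plugin (the version shown is illustrative):

```groovy
plugins {
    id 'war'
    // Gretty provides the jettyRun task used for local testing
    id 'org.gretty' version '2.2.0'
}
```

With this applied, `gradle jettyRun` starts the WAR locally without involving any App Engine task.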
As said in this comment on GitHub:
If you want to use app.yaml from your root directory, you must upgrade to Java 11. Learn more here. With the Java 11 runtime, you must provide your own server, for example Spring Boot with Jetty or Tomcat. The appengine:run goal does not work for app.yaml based projects because each server has a different start up command i.e. spring-boot:run for Spring Boot.
I built a plugin for a web app that uses tomcat.
The plugin is registered as a servlet bean.
Now I want to use RabbitMQ with the latest amqp-client library, which depends on classes in slf4j-api-1.7.25.jar.
Unfortunately, the web app also depends on slf4j, but on an older version.
So adding the new jar file crashes the web app.
Is there any way to rescue this? Both dependencies are out of my control.
No.
Use an older version of the amqp client that has the dependencies you need.
Then, at your leisure, upgrade the web app to the version of slf4j pulled in by the client. That might even count as due diligence.
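As a sketch, pinning the client in Maven might look like this (the version shown is illustrative; pick the newest amqp-client whose slf4j-api matches the one your web app ships):

```xml
<dependency>
  <groupId>com.rabbitmq</groupId>
  <artifactId>amqp-client</artifactId>
  <!-- illustrative version: verify its slf4j-api with `mvn dependency:tree` -->
  <version>4.0.3</version>
</dependency>
```

Running `mvn dependency:tree` after the change shows exactly which slf4j-api version the chosen client pulls in.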
I'm currently developing a plugin system in which I embed Apache Felix in my application. The plugins themselves are OSGi bundles. So far, deploying the bundles works just fine, but I have trouble interacting with my bundles/plugins. I tried two approaches:
Register a "Plugin" service in my plugin and use a service listener in my "host" application to interact with the plugins.
The service listener is not invoked, and I can't cast the returned Plugin object because the Plugin.class of my host application is a different one from the Plugin.class inside the bundle.
Register a "PluginManager" service in the host application and load this manager in the bundle.
In this case I'm again unable to cast the service class, because of the same class "duplication" issue.
I understand why the classes are "duplicated" but I'm not sure what to do about it.
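The cast failure can be reproduced outside OSGi with plain class loaders. This stdlib-only sketch (all names are made up for illustration) loads the same Plugin class through a second loader and shows that the two Class objects are incompatible, which is exactly why the cast in the host fails:

```java
import java.io.InputStream;

public class ClassIdentityDemo {

    // Stands in for the shared Plugin interface/class.
    public static class Plugin {}

    // A class loader that re-defines Plugin itself instead of delegating
    // to the parent, mimicking a bundle that carries its own copy of the API.
    static class IsolatingLoader extends ClassLoader {
        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals(Plugin.class.getName())) {
                try (InputStream in = getResourceAsStream(name.replace('.', '/') + ".class")) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (Exception e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> other = new IsolatingLoader().loadClass(Plugin.class.getName());
        Object instance = other.getDeclaredConstructor().newInstance();
        // Same bytes, different loader: the JVM treats them as distinct types.
        System.out.println("same class object: " + (other == Plugin.class)); // false
        System.out.println("castable: " + Plugin.class.isInstance(instance)); // false
    }
}
```

In OSGi terms: as long as both the host and the bundle each load their own copy of the API package, every service object crosses a loader boundary and `instanceof`/casts fail. The fix is to ensure the package has exactly one provider.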
My current setup:
plugin-api maven module: Provides Plugin interface
app maven module: Contains the app which embeds Apache Felix
dummy-plugin maven module: Has only a dependency on plugin-api
Is there a problem with the way my setup is structured? How can I access host services without creating a class mess? Should I create another module which is used to compile my plugin but is excluded from the bundle, and later provided by the host via FRAMEWORK_SYSTEMPACKAGES_EXTRA?
You should define your Plugin API (and all the non-VM-based types that it uses) on the application side. If I were to do this, I would make an API bundle (yes, a bundle) that exports these packages.
Make sure that the plugins do not export the API themselves, or at least that they allow it to be imported.
In your application, before you start your embedded Felix framework, get all the manifests of all JARs on the classpath with getResources("META-INF/MANIFEST.MF") and check for Export-Package. Then concatenate all these exported packages and set the OSGi framework property org.osgi.framework.system.packages.extra to the joined string.
This will export every package on your classpath, including those of your API bundle. Since the framework now exports these packages, your plugins will use the standard classpath as the provider. Therefore the API will have only one source, and you will not get into class hell.
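The manifest-scanning step above can be sketched with the JDK alone (the Felix embedding itself is omitted here, and the class/method names are illustrative; the property name is the standard OSGi launch property):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.Manifest;

public class SystemPackagesExtra {

    // Joins the Export-Package header values of the given manifests with commas,
    // producing a value suitable for org.osgi.framework.system.packages.extra.
    static String joinExportedPackages(List<Manifest> manifests) {
        List<String> exports = new ArrayList<>();
        for (Manifest mf : manifests) {
            String value = mf.getMainAttributes().getValue("Export-Package");
            if (value != null && !value.isEmpty()) {
                exports.add(value);
            }
        }
        return String.join(",", exports);
    }

    // Loads every MANIFEST.MF reachable through the given class loader.
    static List<Manifest> loadManifests(ClassLoader loader) throws IOException {
        List<Manifest> manifests = new ArrayList<>();
        Enumeration<URL> urls = loader.getResources("META-INF/MANIFEST.MF");
        while (urls.hasMoreElements()) {
            try (InputStream in = urls.nextElement().openStream()) {
                manifests.add(new Manifest(in));
            }
        }
        return manifests;
    }

    public static void main(String[] args) throws IOException {
        String extra = joinExportedPackages(
                loadManifests(SystemPackagesExtra.class.getClassLoader()));
        // Pass this in the configuration map when creating the embedded framework:
        // config.put("org.osgi.framework.system.packages.extra", extra);
        System.out.println(extra);
    }
}
```

Export-Package values may carry attributes (version ranges, uses clauses); since the headers are concatenated verbatim, those survive intact in the joined string.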
I am creating an OSGi bundle and using Apache Karaf as the OSGi container. I am testing an application by adding log statements and placing the bundle in the deploy folder to deploy it. Everything works fine. During testing the bundle id keeps increasing, and after some iterations the activate method is called twice on deployment. I've verified that in a fresh Apache Karaf it works as expected: the activate method is called only once.
Note: the bundle is an application with some simple print statements.
1. Is this a performance issue in the Apache Karaf container once higher bundle ids are reached, or some kind of caching problem in Karaf?
2. Or is it a problem with deploying the bundle via the deploy folder instead of using osgi:install?
There are some issues with the deploy folder. It is monitored by Felix Fileinstall, so the schedule on which it checks the file system determines how it reacts.
Using bundle:install is much more reliable and also works great for testing. Simply deploy your bundle to your local Maven repository using mvn install, then install it into Karaf using the mvn:groupId/artifactId/version URL.
If you then change your bundle, you can simply run mvn install again and do update. This will reload it from your local Maven repo.
If you use a Maven -SNAPSHOT version (which you should during development), you can also use bundle:watch *. Karaf will then look for changes in the local Maven repo and automatically update the bundles.
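Putting the suggested workflow together, it might look like this (the Maven coordinates are placeholders):

```
# build and publish the bundle to the local Maven repository
mvn install

# then, in the Karaf console (-s starts the bundle after installing)
karaf@root()> bundle:install -s mvn:com.example/my-bundle/1.0.0-SNAPSHOT
karaf@root()> bundle:watch *
```

From then on, every `mvn install` of the -SNAPSHOT is picked up and redeployed automatically by bundle:watch.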