I have an abstract class that a child class extends. My abstract class has an @Activate method, and so does the child class. When OSGi creates my service, it invokes the child class's activate method but never the abstract class's. Is there any way to force the abstract class's activate method to be called by OSGi, rather than having the child class manually call the parent's activate method?
Here is some code to help elaborate on what I am asking.
@Component(componentAbstract = true, inherit = true)
@Service(value = ISomeInterface.class)
public abstract class AbstractHello implements ISomeInterface {

    @Activate
    public void activate() {
        System.out.print("Hello ");
    }
}
@Component
@Service(value = ISomeInterface.class)
public class World extends AbstractHello {

    @Activate
    public void activate() {
        System.out.println("World!");
    }
}
The result of the code above would be "World!", rather than "Hello World!".
Initially I thought maybe the child activate method name was clobbering the abstract activate method of the same name. The result is the same even if the abstract class's activate method is given a unique name. Is there any way to have OSGi call the abstract class's activate method for me?
The DS annotation processors only look at the concrete class decorated with @Component. Superclasses are not examined. Since the annotation processing is done at build time, supertypes may come from imported packages which are not chosen until runtime.
Also, the annotation processor generates the component description XML from the annotations, so there can only be one activate="methodName" attribute in the XML. If you need the superclass's method called, then you need to call it from the subclass's method.
This has nothing to do with Apache Felix or OSGi; it is caused by a misunderstanding of class inheritance and method overriding in Java.
Your World class extends the AbstractHello class and overrides its activate() method. If you want the AbstractHello.activate() method to be called, then you must call it from the overriding method:
// Annotations omitted for readability.
public class World extends AbstractHello {

    public void activate() {
        super.activate();
        System.out.println("World!");
    }
}
OSGi can't help here.
UPDATE
Since the base class is abstract and you don't have an instance of it, you can't call its method. Neither can the OSGi container.
I have a Spring bean (ChildBean extends Parent) that extends an abstract class (Parent implements Runnable).
public abstract class Parent implements Runnable {

    public final void run() {
        // some code
    }

    public int overridenFunct() {
        // some code
        return 0; // placeholder return so the snippet compiles
    }
}
Child bean class variant which causes ClassCastException:
@Transactional
@Scope("prototype")
@Service("beanName")
public class ChildBean extends Parent {

    @Override
    public int overridenFunct() {
        // some diff code
        return 0; // placeholder return so the snippet compiles
    }
}
Everything works fine until I override a public non-abstract method from the parent class in the child bean. After that, a ClassCastException is thrown when I try to create an instance of that bean:
Parent p = (Parent) appContext.getBean("beanName");
The bean object returned by getBean() is a ChildBean instance (checked with the debugger). Why does casting the ChildBean object to its abstract parent class Parent not work?
So, without overridenFunct() implemented in ChildBean, everything works fine.
Could someone please tell me what the problem is here?
UPDATE:
Changing the overridenFunct() method to protected fixes the issue. But what if I need to override a public method? Is that allowed? I'm using Spring 3.2.8.
UPDATE2:
Well, I never got to the bottom of why overriding a public method of the abstract parent causes a ClassCastException. As a resolution I did the following: I created an interface containing all the public methods with common logic, and an abstract class that implements that interface and all the "common" methods. All the child beans then extend that abstract class and implement their specific logic.
For anyone who encounters this error, the following may prove useful in debugging it. First and foremost, the problem can be caused by the class loader loading two copies of a particular class because a dependency was included more than once.
Supply the following option to your JVM, either through your IDE or on the command line:
java -verbose:class {rest of your args / options}
Then monitor the console output for the Parent class in question. There is a chance the class has made it into the class loader twice, perhaps because a particular dependency was included more than once. Pay particular attention to the moment when the bean is retrieved from the lookup.
I was able to solve an issue on 4/22/2022 by using the above strategy to track down a problem in our Gradle build script that caused extra class files to make their way into a WAR.
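If the -verbose:class output is hard to wade through, a quick programmatic check along the same lines (a minimal sketch, reusing appContext and the bean name from the question) is to compare the class loaders involved:

Object bean = appContext.getBean("beanName");
System.out.println(Parent.class + " loaded by " + Parent.class.getClassLoader());
System.out.println(bean.getClass() + " loaded by " + bean.getClass().getClassLoader());
System.out.println("assignable to Parent: " + Parent.class.isInstance(bean));

If Parent shows up under two different class loaders, the cast will fail even though the class names match.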
The problem with your code is that appContext.getBean("beanName") does not return an object that inherits from the class Parent.
A common mistake with generically named classes like Parent is a wrong import.
Check that you are importing from the correct package.
If this does not fix the issue, make sure that appContext.getBean("beanName") returns the object you think it does.
It might return a bean object that does not inherit from the Parent class.
The context also might not contain your ChildBean object yet; make sure it is added beforehand.
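A quick way to verify what getBean() actually hands back (a minimal sketch using the names from the question):

Object bean = appContext.getBean("beanName");
System.out.println(bean.getClass().getName());           // the actual runtime class
System.out.println(bean instanceof Parent);              // false if it is not your Parent
System.out.println(Parent.class.getPackage().getName()); // confirms which Parent you imported

If the instanceof check prints false while the class name looks right, you are almost certainly dealing with a different Parent class than the one you think you imported.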
I have an interface method in a library that is called by another method in the same library. The implementations are in the applications which include the library, and they are different for every application. In order to call the interface method, the calling method needs an instance of an implementing class. But since the calling method is in the library, it has no access to the classes in the applications. The calling method is started by a background service and not by the application.
The interface in the library:
public interface InterfaceA {
    void methodA();
}
The class in the application which implements the interface:
public class ClassA implements InterfaceA {

    @Override
    public void methodA() {
        // do something
    }
}
The method in the library which calls the interface method:
public void callInterface() {
    InterfaceA ia;
    ia.methodA(); // how to get this to work?
}
How do I call the interface method from the library without any access to the interface implementations in the applications? I cannot instantiate the interface from my library as the implementation classes are in the application which the library has no access to.
You don't need to do anything to get it to work if you have an instance of the interface:
public void callInterface(InterfaceA ia) {
    ia.methodA();
}
Then add this parameter to the library methods which call this method, and so on. When applications call those methods, they can pass their own implementations.
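On the application side this might look like the following (a minimal sketch; LibraryWorker is a made-up name for whichever library class owns callInterface()):

// In the application, which knows the concrete implementation.
LibraryWorker worker = new LibraryWorker();
worker.callInterface(new ClassA()); // the library only ever sees InterfaceA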
Or, if you really need to create the instance inside the method, introduce another interface and take a supplier:
public void callInterface(Supplier<InterfaceA> iaSupplier) {
    InterfaceA ia = iaSupplier.get();
    ia.methodA();
}
See https://docs.oracle.com/javase/8/docs/api/java/util/function/Supplier.html.
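With the supplier variant, the application would call something like callInterface(ClassA::new), and the library only creates an instance at the moment it actually needs one.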
The calling method is started by a background service and not by the application.
And what starts the background service? You need to add some way for the application to pass the implementation to it.
Alternatively, you could use SPI (https://docs.oracle.com/javase/tutorial/ext/basics/spi.html), but it may be overkill.
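For completeness, here is a minimal ServiceLoader sketch of the SPI approach; it assumes the application jar ships a META-INF/services file named after InterfaceA's fully qualified name, listing ClassA:

import java.util.ServiceLoader;

public void callInterface() {
    // Discovers implementations declared in META-INF/services at runtime,
    // so the library never references ClassA directly.
    for (InterfaceA ia : ServiceLoader.load(InterfaceA.class)) {
        ia.methodA();
    }
}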
public void callInterface() {
    InterfaceA ia;
    ia.methodA(); // how to get this to work?
}
The above should result in a compilation error, since ia is never initialized.
The library method should take the interface InterfaceA as a parameter, as shown in the other answer. Or, if the implementation is stateless, pass it in when constructing the class that contains this method.
What I know so far:
An instance of a servlet is first created by the container via reflection, using the no-argument constructor.
Then the parameterized init method gets called.
It is also suggested that we should not create a constructor in a servlet class, as it is of no use, and I agree with that.
Let's say I have created a no-argument constructor in my servlet class and from within it I am calling a parameterized constructor. My question is, will it be called by the container?
public class DemoServlet extends HttpServlet {

    public DemoServlet() {
        this(1);
    }

    public DemoServlet(int someParam) {
        // Do something with the parameter
    }
}
Will DemoServlet() be called by the container, and if we put some initialization code inside it, will that code be executed? My guess is yes, but it's just a guess based on my understanding.
This might be pretty useless; I am asking out of curiosity.
DemoServlet() will be called, as you are overriding the no-arg constructor defined in HttpServlet (which is a no-op constructor).
However, the other constructor, DemoServlet(int arg), will not be called by the container directly.
You are correct with your guess. DemoServlet() would be called by the container, and any initialization code within it would be executed, even if that initialization is done through constructor chaining. As a matter of fact, this is a good way to get dependency injection and create a thread-safe servlet that is testable. Typically it would be written this way:
public class DemoServlet extends HttpServlet {

    private final int someParam; // someParam is final; once set it cannot be changed

    // Default constructor, called by the container.
    public DemoServlet() {
        // Constructor-chained to the parameterized constructor.
        this(1);
    }

    // Observe carefully that this parameterized constructor has only
    // package-level visibility. This is useful for being invoked by your
    // unit and functional tests, which would typically reside in the same
    // package, and allows your test code to inject the required values to
    // verify behavior while testing.
    DemoServlet(int someParam) {
        this.someParam = someParam;
    }

    // ... other class code ...
}
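For example, a test in the same package could then inject its own value (a minimal sketch; JUnit 4 is assumed, and the test class lives in the same package as DemoServlet so it can use the package-private constructor):

import org.junit.Test;

public class DemoServletTest {

    @Test
    public void usesInjectedParam() {
        DemoServlet servlet = new DemoServlet(42); // inject a test value
        // ... exercise the servlet and assert on behavior that depends on someParam
    }
}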
I'm trying to unit-test some classes that make use of a Singleton class whose constructor does some things I can't (and shouldn't) do from the unit-test environment. My ideal scenario would be to end up with the constructor completely suppressed and then stub out the other member methods that my test classes invoke. My problem is that I can't seem to get the constructor suppressed.
My understanding of a way to solve this would be something like the following:
public class MySingleton extends AbstractSingletonParent {

    public final static MySingleton Only = new MySingleton();

    private MySingleton() {
        super(someVar); // I want the super-class constructor to not be called
        //
        // more code I want to avoid
    }

    public Object stubbedMethod() { return null; }
}
public class ClassToBeTested {

    public void someMethod() {
        Object o = MySingleton.Only.stubbedMethod();
    }
}
@RunWith(PowerMockRunner.class)
@PrepareForTest(MySingleton.class)
public class TestClass {

    @Test
    public void someTest() {
        suppress(constructor(MySingleton.class));
        mockStatic(MySingleton.class);

        PowerMock.replay(MySingleton.class);
        // invoke ClassToBeTested, etc.
        PowerMock.verify(MySingleton.class);

        // make some assertions
    }
}
Unfortunately during the createMock invocation, the MySingleton constructor is hit, and it still calls the super constructor.
Am I doing something silly? I found an example on the web doing almost exactly this, but it was using a deprecated suppressConstructor method. Despite the deprecation I tried that, too, to no avail...
Is what I'm trying to do possible? If so, what am I doing wrong?
*Edited version now works.
You need to annotate TestClass with the @PrepareForTest annotation so PowerMock has a chance to manipulate the bytecode of the singletons.
Also, the superclass constructor suppression signature should include someVar's class; right now you're just suppressing the default constructor.
See the @PrepareForTest API docs. Here's a blog post with some more details as well.
FWIW, it's working for me:
@RunWith(PowerMockRunner.class)
@PrepareForTest({EvilBase.class, NicerSingleton.class})
public class TestEvil {

    @Test
    public void testEvil() {
        suppress(constructor(EvilBase.class));
        assertEquals(69, EvilBase.getInstance().theMethod());
    }

    @Test
    public void testNice() {
        suppress(constructor(EvilBase.class));
        suppress(constructor(NicerSingleton.class));
        assertEquals(42, NicerSingleton.getInstance().theMethod());
    }
}
How about setting the instance field (Only in your code) of your singleton to an instance created with the constructor you want? You can do all of this with the Reflection API or dp4j.
The motivating example of the dp4j publication discusses exactly that.
I am not sure what it is that you are doing wrong, but on the design side I can suggest you look into dependency injection (DI).
To make your code testable, use DI: pass the singleton into the class under test as a constructor argument. Since it is passed as an argument, your test case can supply a custom implementation of the abstract singleton parent class, and your test case should work fine.
With DI, your code will become more testable.
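A minimal sketch of that idea, reusing the names from the question; it assumes stubbedMethod() is (or can be) declared on AbstractSingletonParent, or on an interface you extract, so that a test can hand in a stub:

public class ClassToBeTested {

    private final AbstractSingletonParent dependency;

    // Production code passes MySingleton.Only; a test passes a stub or a mock.
    public ClassToBeTested(AbstractSingletonParent dependency) {
        this.dependency = dependency;
    }

    public void someMethod() {
        Object o = dependency.stubbedMethod();
        // ...
    }
}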
I've got a series of web actions I'm implementing in Seam to perform create, read, update, etc. operations. For my read/update/delete actions, I'd like to have individual action classes that all extend an abstract base class. I'd like to put the @Factory method in the abstract base class to retrieve the item that is to be acted upon. For example, I have this as the base class:
public abstract class BaseAction {

    @In(required = false) @Out(required = false)
    private MyItem item = null;

    public MyItem getItem() { ... }

    public void setItem(...) { ... }

    @Factory("item")
    public void initItem() { ... }
}
My subclasses would extend BaseAction so that I don't have to repeat the logic to load the item that is to be viewed, deleted, updated, etc. However, when I start my application, Seam throws errors saying I have declared multiple @Factory methods for the same object.
Is there any way around this? Is there any way to provide the @Factory in the base class without encountering these errors?
The problem you're encountering is that every Seam component needs a unique name - using your approach you'd have a component named "item" for each subclass.
I would do the following:
@Name("action1")
public class Action1 extends BaseAction
{
    ...
}
And in components.xml:
<factory name="action1Item" value="#{action1.item}" />