I'm developing a custom binary Gradle plugin, following the pattern shown here.
My extension, VersionInfoExtension, is abstract with abstract methods, just like the BinaryRepositoryExtension in the documentation. My task is also abstract, with abstract methods for the parameters, like the LatestArtifactVersion example in the documentation. When I test, I get the following error:
An exception occurred applying plugin request [id: 'com.example.version-info']
> Failed to apply plugin 'com.example.version-info'.
> Could not create an instance of type com.example.gradle.version.VersionInfoExtension.
> Could not generate a decorated class for type VersionInfoExtension.
> Cannot have abstract method VersionInfoExtension.jars().
What am I missing? Are there any good examples of making this work?
I'm using Gradle 7.x and Kotlin.
The "Cannot have abstract method myPropertyName" error is reported when the method name is prefixed by "is":
abstract val isRegistered: Property<Boolean>
That was annoying to track down. The type doesn't seem to matter.
The solution was to remove "is" from the name.
The problem here seems to be the name of the abstract method.
All configurable properties must be exposed through bean-style getter methods; this holds for both extensions and tasks.
So you should have used (assuming a Java class):
abstract Property<String> getJars()
instead of
abstract Property<String> jars()
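Putting that together, here is a minimal sketch of an extension that Gradle can decorate, assuming a plain Java abstract class and a String-typed property (only the jars name comes from the error message; the rest is illustrative):
import org.gradle.api.provider.Property;

// Gradle can only generate the decorated class when every configurable
// property is exposed as an abstract bean-style getter.
public abstract class VersionInfoExtension {
    public abstract Property<String> getJars();
}
The plugin would then register it with something like project.getExtensions().create("versionInfo", VersionInfoExtension.class).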
In addition to @Richard Sitze's answer, in my case it was a custom setter function used only in tests (annotated with @TestOnly) that was preventing the class from being generated.
I had declared a serializable object as an @Input:
@get:Input
abstract val myProperty: Property<MyObject>
@TestOnly
fun setMyProperty(obj: MyObject) {
//...
}
Somehow the name of that function interfered with the generation of the property's setter.
I'm trying to mock a class derived from an Apache Beam generic class and stub a method on it using Mockito.
This is my real class:
public class MyClass extends DoFn<Entity, TableRow> {
public void processElement(ProcessContext c) {
// some business logic
c.output(new TableRow()); // c.output receives a type defined in the derived class
}
}
And this is the test with the required mock:
DoFn<Entity, TableRow>.ProcessContext testContext = (DoFn<Entity, TableRow>.ProcessContext)mock(DoFn.ProcessContext.class);
when(testContext.output(any(TableRow.class))).thenReturn(42);
For some reason, I'm getting an error on the second line. This is the error:
Required type:
T
Provided:
void
reason: no instance(s) of type variable(s) T exist so that void conforms to T
Any solution for this?
Thanks!
An acceptable answer seems to be found in the comments, but I would also in general advise against writing tests by mocking out Beam DoFns and the like. Instead, the recommendation would be to either factor out the code in the body of the DoFn into something that can be more directly tested (if it's non-trivial) or, preferably, use the DoFn(s) in an actual pipeline and assert that it produces the correct results (see https://beam.apache.org/documentation/pipelines/test-your-pipeline/).
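For reference, here is a hedged sketch of that pipeline-level style of test instead of mocking ProcessContext. It assumes MyClass is the DoFn from the question, that the Beam Java test artifacts and JUnit are on the classpath, and that buildTestEntity() is a placeholder for however a sample Entity is constructed:
// plus the Entity and TableRow imports already used by MyClass
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Rule;
import org.junit.Test;

public class MyClassTest {

    @Rule
    public final transient TestPipeline pipeline = TestPipeline.create();

    @Test
    public void emitsOneTableRowPerEntity() {
        PCollection<TableRow> output = pipeline
                .apply(Create.of(buildTestEntity()))  // feed one test element
                .apply(ParDo.of(new MyClass()));      // run the real DoFn

        // Assert on the actual output instead of stubbing c.output(...)
        PAssert.that(output).satisfies(rows -> {
            // inspect the produced TableRow instances here
            return null;
        });

        pipeline.run().waitUntilFinish();
    }

    private Entity buildTestEntity() {
        // placeholder: return a real Entity here; Create.of(...) needs a
        // non-null element with an inferable coder
        return null;
    }
}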
I'm using a framework (Ashley) in a Java project.
To use it I often have to write something like this:
entityEngine.getSystem(RenderingSystem.class).setProcessing(true);
While RenderingSystem is something I've created, getSystem is part of the framework itself. The implementation of that method looks like this:
/**
* Quick {@link EntitySystem} retrieval.
*/
@SuppressWarnings("unchecked")
public <T extends EntitySystem> T getSystem(Class<T> systemType) {
return systemManager.getSystem(systemType);
}
Now, even though I can compile and run the code with Gradle, my IDE (IntelliJ) shows errors with the following warning:
What did I do wrong, or how can I at least suppress this kind of warning?
EDIT
Here is the definition of my class "RenderingSystem":
public class RenderingSystem extends EntitySystem {...}
Your method accepts a class of type T, where T extends EntitySystem.
Your RenderingSystem has to extend EntitySystem (or implement it, if EntitySystem is an interface) to be compatible with that method.
While RenderingSystem is something I've created, getSystem is part of the framework itself. The implementation of that method looks like this:
If you pass a RenderingSystem class that you defined in your own project and cannot pass it to the getSystem() method, it means that RenderingSystem doesn't derive from EntitySystem.
You should probably extend/implement the framework's EntitySystem class to make the getSystem() method usable with your own classes passed as parameters.
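A small sketch of what both answers describe (the setup class and method names are illustrative): once RenderingSystem extends the framework's EntitySystem, it satisfies the <T extends EntitySystem> bound and the call resolves without warnings.
import com.badlogic.ashley.core.Engine;
import com.badlogic.ashley.core.EntitySystem;

public class RenderingSystem extends EntitySystem {
    // custom rendering logic goes here
}

// elsewhere, e.g. in the game setup code:
class GameSetup {
    void configureRendering(Engine entityEngine) {
        // T is inferred as RenderingSystem, so no cast or warning is needed
        RenderingSystem rendering = entityEngine.getSystem(RenderingSystem.class);
        rendering.setProcessing(true);
    }
}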
Suppose you have moduleA and moduleB. moduleA defines an interface (for instance, a service interface) and moduleB has a concrete class that implements the interface (provides the service).
Now, if the interface has a default method and you invoke it on the class in moduleB (from another module), is this invocation considered to be performed inside moduleA or moduleB?
Apparently it is from moduleA... what's the rationale?
Example: suppose you have code that does this:
InputStream is = this.getClass().getResourceAsStream(fullPath);
If this code lies in the implementation of the service in moduleB, the stream is opened. But if the code lies in the default method in moduleA, then when the service is invoked on moduleB you need the resource to be "opened" in moduleB (so it seems that the invocation is treated as coming from "outside" moduleB).
I would like to read about the reason for that.
Thanks.
Editing my question with an example.
Suppose you have in moduleA this code:
public interface PropertiesProvider {
public default Properties get(String domain) {
Class<?> clazz = this.getClass();
System.out.println(" CLASS " + clazz);
InputStream is = clazz.getResourceAsStream(domain);
if (is != null) {
Properties props = new Properties();
try {
props.load(is);
return props;
} catch (IOException e) {
//log
}
}
return null;
}
}
and in moduleB
public class PropertiesProviderImpl implements PropertiesProvider {}
If you invoke the service from moduleA, the call is traced as coming from class PropertiesProviderImpl; it finds the resource but does not load it if it is not "opened".
If you copy the code into PropertiesProviderImpl, the call is traced to that same class; it finds the resource and loads it even when it is not "opened".
So my question is: why the difference, since the call comes from the same class?
(the difference being that in one case the method is kind-of inherited from the default method in the interface)
Look at the documentation of getResourceAsStream: "If this class is in a named Module then this method will attempt to find the resource in the module."
In the first case your code (in moduleA) sees the type but cannot see the class which implements your type, because it's in moduleB. In the second case your code can see the class which "implements" the type.
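As an illustration of that (module and package names are hypothetical), moduleB has to open the package containing the resource before code running in moduleA's default method can read it:
// module-info.java of moduleB (hypothetical names)
module moduleB {
    requires moduleA;
    provides com.example.a.PropertiesProvider
        with com.example.b.PropertiesProviderImpl;
    // without this, getResourceAsStream() called from code in moduleA
    // returns null for resources packaged in this package
    opens com.example.b.config;
}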
Look at the reference below; the most important sentences are:
In a modular setting the invocation of Class::forName will continue to work so long as the package containing the provider class is known to the context class loader. The invocation of the provider class’s constructor via the reflective newInstance method, however, will not work: The provider might be loaded from the class path, in which case it will be in the unnamed module, or it might be in some named module, but in either case the framework itself is in the java.xml module. That module only depends upon, and therefore reads, the base module, and so a provider class in any other module will not be accessible to the framework.
[...]
instead, revise the reflection API simply to assume that any code that reflects upon some type is in a module that can read the module that defines that type.
[Long answer]: reflective-readability
A framework is a facility that uses reflection to load, inspect, and instantiate other classes at run time [...]
Given a class discovered at run time, a framework must be able to access one of its constructors in order to instantiate it. As things stand, however, that will usually not be the case.
The platform’s streaming XML parser, e.g., loads and instantiates the implementation of the XMLInputFactory service named by the system property javax.xml.stream.XMLInputFactory, if defined, in preference to any provider discoverable via the ServiceLoader class. Ignoring exception handling and security checks the code reads, roughly:
String providerName
= System.getProperty("javax.xml.stream.XMLInputFactory");
if (providerName != null) {
Class providerClass = Class.forName(providerName, false,
Thread.currentThread().getContextClassLoader());
Object ob = providerClass.newInstance();
return (XMLInputFactory)ob;
}
// Otherwise use ServiceLoader
...
In a modular setting the invocation of Class::forName will continue to work so long as the package containing the provider class is known to the context class loader. The invocation of the provider class’s constructor via the reflective newInstance method, however, will not work: The provider might be loaded from the class path, in which case it will be in the unnamed module, or it might be in some named module, but in either case the framework itself is in the java.xml module. That module only depends upon, and therefore reads, the base module, and so a provider class in any other module will not be accessible to the framework.
To make the provider class accessible to the framework we need to make the provider’s module readable by the framework’s module. We could mandate that every framework explicitly add the necessary readability edge to the module graph at run time, as in an earlier version of this document, but experience showed that approach to be cumbersome and a barrier to migration.
We therefore, instead, revise the reflection API simply to assume that any code that reflects upon some type is in a module that can read the module that defines that type. This enables the above example, and other code like it, to work without change. This approach does not weaken strong encapsulation: A public type must still be in an exported package in order to be accessed from outside its defining module, whether from compiled code or via reflection.
Since we didn't precisely understand the previous response, we carried out some additional tests.
In each test the resource file is not "opened".
1)
The code invoking clazz.getResourceAsStream is in the default method of the interface defining the service. The class implementing the interface does not define any method.
-> this.getClass() yields the implementing class; the test fails to find the resource
2)
We added this code in the default method:
Object obj = clazz.getConstructor().newInstance();
and yes, it fails.
3) We changed the code so PropertiesProvider is an abstract class and PropertiesProviderImpl inherits from it.
Same behaviour.
So yes, it means that the same code will behave differently depending on whether you inherit it or invoke it directly.
This is worrying: it means the inner logic of the language leads to convoluted, byzantine behaviours (the reason why we dumped C++).
This is a Canonical Question because this is a common error with Dagger 2.
If your question was flagged as a duplicate, please read this post carefully and make sure you understand what this error means and why it occurred. If this post does not work for you, make sure to include where and how you provide the mentioned classes, and include the full error message in your question, like the one here.
I tried to use a dependency with Dagger 2, but I receive the following error when I try to compile my project:
error: com.example.MyDependency cannot be provided without an @Inject constructor or from an @Provides-annotated method.
com.example.MyDependency is provided at
com.example.MyComponent.myDependency()
What does this mean and how can I fix it?
I have a component and tried to provide a dependency. My basic setup looks like this:
// this is the dependency I try to use
class MyDependency {}
@Component
interface MyComponent {
// I want to make it accessible to be used with my component
MyDependency myDependency();
}
tl;dr You forgot to either add @Inject to your constructor so that Dagger can use constructor injection to provide the object, or you need some method in one of your modules that creates or binds the object.
What's going on?
Have a good look at the error message: It states that you try to request a dependency but Dagger has no way to provide or create it. It simply does not know how to, because it cannot be provided without an @Inject constructor or from an @Provides-annotated method.
A close look at the error message shows the class (a) that you are trying to provide and the component (b) that needs it.
com.example.MyDependency (a) is provided at
com.example.MyComponent.myDependency() (b)
You have to make sure that (b) can create or provide (a) to fix your issue.
It looks a bit more complex if you tried to inject your dependency somewhere else, but you can still see the full chain of events: in this case, a constructor injection is missing a dependency. The error names the class (a) that you are trying to provide and the location (b) where Dagger tried injecting it. It also tells you where that dependent class was created (c) and again the component (d) that failed to provide (a).
com.example.MyDependency cannot be provided without an @Inject constructor or from an @Provides-annotated method.
com.example.MyDependency (a) is injected at
com.example.DependentClass.(dependency) (b)
com.example.DependentClass is provided at (c)
com.example.MyComponent.myDependency() (d)
The same applies here: Make sure that (d) knows how to provide (a) and you're good to go.
How do I fix this?
Have a look at the error as shown above. Make sure you understand where it occurred and what you are trying to inject. Then tell Dagger how to provide your object.
an @Inject constructor
As the error states, you try to use MyDependency but MyComponent does not know how to do that. If we have a look at the example it becomes clear why:
class MyDependency {}
The class has no @Inject-annotated constructor! And there is no module in the component either, so there is nothing Dagger could do.
If you want to use constructor injection, you can just add an @Inject-annotated constructor and you're done. Dagger will see this constructor and know how to create your class.
class MyDependency {
@Inject
MyDependency() { /**/ }
}
That is all you have to do when you can make use of constructor injection.
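A hedged usage sketch (DaggerMyComponent is the class name Dagger's annotation processor generates for the component above):
public class Main {
    public static void main(String[] args) {
        // the generated component can now build MyDependency on request,
        // because it sees the @Inject constructor
        MyComponent component = DaggerMyComponent.create();
        MyDependency dependency = component.myDependency();
    }
}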
from an @Provides-annotated method
The error message states a second option, which allows you to provide an object if you don't want to, or can't, use constructor injection. You can also add a @Provides-annotated method to a module and add this module to your component.
@Module
class MyModule {
@Provides
MyDependency provideMyDependency() {
return new MyDependency();
}
}
@Component(modules = MyModule.class)
interface MyComponent {
MyDependency myDependency();
}
This way Dagger can use your module to create and provide your dependency. It is a little bit more boilerplate than using Constructor Injection, but you will have to use Modules for everything that needs further setup or that does not have an annotated constructor, e.g. third party libraries like Retrofit, OkHttp, or Gson.
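As a hedged illustration of the third-party case just mentioned (the module name and Gson configuration are made up), Gson has no constructor you can annotate, so a module creates it:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import dagger.Module;
import dagger.Provides;

@Module
class NetworkModule {
    // Dagger calls this whenever a Gson instance is requested
    @Provides
    Gson provideGson() {
        return new GsonBuilder().setLenient().create();
    }
}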
There are also other ways to provide a dependency from a component. A @Subcomponent has access to its parent's dependencies, and a component dependency can expose some of its dependencies to its dependent components. But at some point everything Dagger provides needs to either have an @Inject constructor or a module providing it.
But I did add MyDependency!
Pay close attention to the details. You are probably using an interface when you only provide the implementation, or trying to use a parent class when Dagger only knows about the subclass.
Maybe you added a custom @Qualifier or used @Named("typeA") with it. To Dagger this is a completely different object! Double-check that you actually provide and request the same dependency.
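For example, a hypothetical qualifier mismatch (using javax.inject.Named) looks like this; the binding is qualified, so an unqualified request for the same type still fails with the same error:
@Module
class MyModule {
    @Provides
    @Named("typeA")
    MyDependency provideTypeA() { return new MyDependency(); }
}

@Component(modules = MyModule.class)
interface MyComponent {
    @Named("typeA")
    MyDependency qualified();      // matches the qualified binding

    // MyDependency unqualified(); // would fail: no unqualified binding exists
}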
Read the error and make sure that you either have an @Inject annotated constructor, a module that has a @Provides method that provides that type, or a parent component that does.
What if I want to provide an implementation for my interface?
A simple example like the following shows how one class extends another:
class MyDependency extends MyBaseDependency {
@Inject MyDependency() { super(); }
}
This will inform Dagger about MyDependency, but not about MyBaseDependency.
If you have one class implementing an interface or extending a super class, you have to declare that. If you provide MyDependency, this does not mean that Dagger can provide MyBaseDependency. You can use @Binds to tell Dagger about your implementation and provide it when the super class is required.
@Module
interface MyModule {
@Binds
MyBaseDependency provideMyBaseDependency(MyDependency implementation);
}
I am working with Spring Data's Neo4j graph DB hello-worlds example and I ran across the following code in WorldRepositoriesImpl.java...
@Autowired private WorldRepository worldRepository;
Furthermore, WorldRepository is defined as...
public interface WorldRepository extends MyWorldRepository,
GraphRepository<World>,
NamedIndexRepository<World>
{/* no method defined here */}
Now the odd part: no class that I can find actually implements WorldRepository. So, a few questions...
How is this possible? Where is this documented? Is there a way to make this a bit more explicit (less mysterious)?
Running the code with a debugger attached shows that the worldRepository instance wired up by Spring is a proxy object created at runtime.
Looking at the pom.xml and the dependencies included, it looks like the spring-neo4j library bundles in some Aspects that create this implementation class at runtime.
In other words, there is no implementation of this interface declared in the source code - but one is created at runtime with AspectJ and other tools.
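A small hedged sketch (the wrapping class and method names are illustrative) that makes this visible at runtime: printing the class of the injected repository shows a generated proxy rather than any hand-written implementation.
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
class RepositoryInspector {

    @Autowired
    private WorldRepository worldRepository;

    public void inspectRepository() {
        // typically prints a JDK/Spring-generated proxy class name,
        // not a class from your own source tree
        System.out.println(worldRepository.getClass().getName());
    }
}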