I have an abstract class A and a class B that extends it:
public abstract class A extends RealmObject {
}

public class B extends A {
}
When building, I get the following error:
error: A RealmClass annotated object must be derived from RealmObject
As far as I know, subclassing a subclass of RealmObject isn't supported by Realm for Android at the moment, but there is already an item on their roadmap to improve this ( https://github.com/realm/realm-java/issues/761 ).
At least, you can always duplicate fields between model classes, to simulate subclassing (not very elegant though).
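As a rough illustration of that workaround (class and field names are made up for this example), each model repeats the shared fields instead of inheriting them:
import io.realm.RealmObject;

public class Dog extends RealmObject {
    private String name; // duplicated "base" field
    private int age;     // duplicated "base" field
    private String breed;
    // getters and setters omitted for brevity
}

public class Cat extends RealmObject {
    private String name; // duplicated "base" field
    private int age;     // duplicated "base" field
    private boolean indoor;
    // getters and setters omitted for brevity
}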
Inheritance is not supported (answer based on v1.1.0), only interface implementation is.
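So a common pattern is to move the shared contract into an interface that each RealmObject implements directly; a minimal sketch with hypothetical names:
import io.realm.RealmObject;

public interface HasName {
    String getName();
    void setName(String name);
}

public class Person extends RealmObject implements HasName {
    private String name;

    @Override
    public String getName() { return name; }

    @Override
    public void setName(String name) { this.name = name; }
}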
I solved this issue by removing the dependency
compile 'io.realm:realm-android:1.1.0'
and adding the classpath instead:
buildscript {
repositories {
jcenter()
}
dependencies {
classpath "io.realm:realm-gradle-plugin:1.1.0"
}
}
and also applying the plugin:
apply plugin: 'realm-android'
For further details, you can visit Realm.
Related
I am experiencing an issue regarding access of an abstract class.
For background, I have an abstract class in a common module. The abstract class is just setters/getters for some values.
abstract class BaseConfiguration {
abstract var key: String?
abstract var token: String?
}
The common module is used by another module, which is an SDK; in Gradle, the common module is added with implementation.
Gradle
implementation project(":common")
Inside the SDK module there are public interfaces to allow setting/getting the values in the BaseConfiguration class.
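For context, such an SDK-facing interface might look roughly like this; the SdkConfiguration name and its methods are my assumption, not the actual SDK code:
// Hypothetical interface exposed by the SDK; the app is meant to see only
// this type, never BaseConfiguration from the common module.
public interface SdkConfiguration {
    String getKey();
    void setKey(String key);
    String getToken();
    void setToken(String token);
}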
The conflict I am running into is that with Kotlin everything works fine: I am able to create a project, add the SDK as a dependency, and then set values for the BaseConfiguration through the interfaces provided by the SDK.
However, when I try to use the Java language, I receive a "cannot access BaseConfiguration" error. The only thing that seems to resolve this issue is to change my SDK Gradle dependency to api.
Gradle Changes
api project(":common")
I have concerns with this change, because the common module is supposed to be internal and only used by the SDK. The app level should not have access to it; the app should only access the interfaces provided by the SDK.
Does anyone know why this is working perfectly with Kotlin, but not with Java? What am I doing wrong?
Further Notes:
When decompiling the Kotlin BaseConfiguration class to Java, it shows that everything is public.
public abstract class BaseConfiguration {
@Nullable
public abstract String getKey();
public abstract void setKey(@Nullable String var1);
@Nullable
public abstract String getToken();
public abstract void setToken(@Nullable String var1);
}
By default, Java's class visibility is package-private. That means only files in the same package can access your abstract BaseConfiguration.
In Kotlin, the default is public.
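A minimal Java illustration of that default (class names invented; each class would live in its own file):
// No modifier: package-private, visible only within its own package.
abstract class InternalConfig {
}

// Explicitly public: visible to any consumer that has it on the compile classpath.
public abstract class PublicConfig {
}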
I suppose it is this visibility that is causing the issue. You should not use api for the common module if the app module is not using it; common's visibility should only be exposed to the SDK module as its direct neighbour. You are right not to expose the common module as a transitive dependency of the app module.
Try being explicit about the class visibility:
public abstract class BaseConfiguration {
abstract var key: String?
abstract var token: String?
}
I would take a look at the Java class produced by the Kotlin compiler and see how it is generated.
I transferred the model classes of my Android app, including the realm.io functions, into a library, following the advice described in the realm.io Java docs section Sharing schemas.
I'm facing a NoSuchMethodError when my app (indirectly) calls realm.io methods which are in the library.
java.lang.NoSuchMethodError: No static method copyOrUpdate(Lio/realm/Realm;Lmy/package/structure/MyModelClass;ZLjava/util/Map;)Lmy/package/structure/MyModelClass; in class Lio/realm/MyModelClassRealmProxy; or its super classes (declaration of 'io.realm.MyModelClassRealmProxy' appears in /data/user/0/my.name.space/files/.jrebel/load-dexes/bundle3/classes.zip)
at io.realm.MyModuleMediator.copyOrUpdate(MyModuleMediator.java:98)
at io.realm.Realm.copyOrUpdate(Realm.java:1164)
at io.realm.Realm.copyToRealmOrUpdate(Realm.java:757)
Here is what my library looks like:
build.gradle (project)
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:2.1.2'
classpath 'io.realm:realm-gradle-plugin:1.0.1'
}
}
build.gradle (library module)
apply plugin: 'com.android.library'
apply plugin: 'realm-android'
...
buildTypes {
release {
minifyEnabled false
signingConfig signingConfigs.release
proguardFile getDefaultProguardFile('proguard-rules.pro')
}
}
...
proguard-rules.pro
I used this snippet as a template
-keep class io.realm.annotations.RealmModule
-keep @io.realm.annotations.RealmModule class *
-keep class io.realm.internal.Keep
-keep @io.realm.internal.Keep class * { *; }
-keep class my.package.MyModelClass
-dontwarn javax.**
-dontwarn io.realm.**
MyModule.java
package my.package;
import io.realm.annotations.RealmModule;
@RealmModule(library = true, allClasses = true)
public class MyModule { }
MyRealm.java
package my.package;
import ...
public class MyRealm {
private final RealmConfiguration realmConfig;
public MyRealm(Context context) {
realmConfig = new RealmConfiguration.Builder(context)
.name("name.of.my.config")
.modules(new MyModule())
.build();
}
public Realm getRealm() {
return Realm.getInstance(realmConfig);
}
}
MyModelClass.java
package my.package;
import ...
public class MyModelClass extends RealmObject {
public void save(Context context) {
Realm realm = new MyRealm(context).getRealm();
realm.executeTransaction(new Realm.Transaction() {
@Override
public void execute(Realm bgRealm) {
bgRealm.copyToRealmOrUpdate(MyModelClass.this);
}
});
realm.close();
}
}
In my actual app I call something like this, which causes the exception:
MyActivity.java
// ...
MyModelClass c = new MyModelClass();
c.save(context);
The code above was working well when everything was in the app project.
Am I missing something general?
Is there something more I need to consider regarding the proguard settings of the lib?
May JRebel cause this kind of problem?
I think you might not be using it correctly. I would suggest that you look at a full working example like the one referenced at the end of that Sharing Schemas link. The link points to this repository, a full working example of a Realm library and app. I would suggest that you take a look at the Zoo class from the library; its methods are the only thing exposed to the application, as you can see in the app's activity here (line 148 and downwards).
I think this was just a confusion on your part, because from what I understood you are calling your library class and using it as if it were a Realm instance for that activity context, which is not the case. Hope this can steer you onto the right path!
Basically, you simply create an instance based on your configuration, so you don't need to pass a context there.
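As a rough sketch of that pattern, loosely following the Zoo example linked above (the ModelRepository name is my own invention), the library class just receives a Realm instance that the app opened from its own configuration, so the library itself never needs a Context:
import io.realm.Realm;

// Library-side helper: it operates on a Realm that the caller already opened,
// so no Context or RealmConfiguration is created inside the library.
public class ModelRepository {
    private final Realm realm;

    public ModelRepository(Realm realm) {
        this.realm = realm;
    }

    public void save(final MyModelClass model) {
        realm.executeTransaction(new Realm.Transaction() {
            @Override
            public void execute(Realm bgRealm) {
                bgRealm.copyToRealmOrUpdate(model);
            }
        });
    }
}

// App side (Realm 1.x): the app owns the configuration and the Realm lifecycle.
// RealmConfiguration config = new RealmConfiguration.Builder(context).build();
// Realm realm = Realm.getInstance(config);
// new ModelRepository(realm).save(new MyModelClass());
// realm.close();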
This happens with JRebel. I would suggest you try the latest version of it; if the issue still persists, try not using it.
I received the same error message as the original post.
java.lang.NoSuchMethodError: No static method copyOrUpdate(Lio/realm/Realm;Lmy/package/structure/MyModelClass;ZLjava/util/Map;)Lmy/package/structure/MyModelClass; in class Lio/realm/MyModelClassRealmProxy; or its super classes (declaration of 'io.realm.MyModelClassRealmProxy' appears in /data/user/0/my.name.space/files/.jrebel/load-dexes/bundle3/classes.zip)
at io.realm.MyModuleMediator.copyOrUpdate(MyModuleMediator.java:98)
at io.realm.Realm.copyOrUpdate(Realm.java:1164)
at io.realm.Realm.copyToRealmOrUpdate(Realm.java:757)
But my problem was a different one. I had one Realm module in a library and an additional Realm module in my app. Unfortunately both modules contain an entity with the same class name (but in different packages).
The code generation for Realm creates only one io.realm.<classname>RealmProxy class. Because of that, I got the above-mentioned exception.
The solution is quite simple: just rename one of the entities.
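For illustration (package and class names here are invented, not from the original post), the clash and the fix look roughly like this:
import io.realm.RealmObject;

// library module: com.example.library.Settings
public class Settings extends RealmObject {
}

// app module: renamed from Settings to AppSettings so the generated
// io.realm.<classname>RealmProxy no longer collides with the library's proxy.
public class AppSettings extends RealmObject {
}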
I have a project based on Dagger 2 which consists of two modules. The core module includes some interfaces and some classes that have member injections declared for these interfaces.
The actual implementations of these interfaces are included in the second module which is an Android project. So, naturally the provide methods for these are included in the Android project.
Dagger will complain during compilation about not knowing how to inject these in the core module.
Any thoughts on how to achieve this without using constructor injections?
In short, I just tried this, and it works. Be sure to check the exact error messages and make sure you are providing these interfaces and that @Inject annotations are present.
There is probably just some wrongly named interface or a missing annotation. What follows is a full sample using your described architecture that compiles just fine. The issue you are currently experiencing is probably the one described in the last part of this post. If possible, you should go with the first solution, though, and just add those annotations.
The library
For reproducibility this sample has minimalist models. First, the interface needed by my class in the library module:
public interface MyInterface {
}
Here is my class that needs that interface. Make sure to declare it in the constructor and provide the @Inject annotation!
@MyScope // be sure to add scopes in your class if you use constructor injection!
public class MyClassUsingMyInterface {
private MyInterface mMyInterface;
@Inject
public MyClassUsingMyInterface(MyInterface myInterface) {
mMyInterface = myInterface;
}
}
The idea is that the interface will be implemented by the app using MyClassUsingMyInterface and provided by dagger. The code is nicely decoupled, and my awesome library with not so many features is complete.
The application
Here we need to supply the actual coupling. This means that to get MyClassUsingMyInterface we have to make sure we can supply MyInterface. Let's start with the module supplying that:
@Module
public class MyModule {
@Provides
MyInterface providesMyInterface() {
return new MyInterface() {
// my super awesome implementation. MIT license applies.
};
}
}
And to actually use this, we provide a component that can inject into MyTestInjectedClass that is going to need MyClassUsingMyInterface.
@Component(modules = MyModule.class)
public interface MyComponent {
void inject(MyTestInjectedClass testClass);
}
Now we have a way to provide the requested interface. We declared the interface needed by the library class in a constructor marked with @Inject. Now I want a class that uses my awesome library class, and I want to inject it with Dagger.
public class MyTestInjectedClass {
@Inject
MyClassUsingMyInterface mMyClassUsingMyInterface;
void onStart() {
DaggerMyComponent.create().inject(this);
}
}
Now we hit compile...and dagger will create all the factories needed.
Inject Libraries you can not modify
To show the full scale of Dagger, this sample could also have been done without actual access to the source code of the library. If there is no @Inject annotation, Dagger will have a hard time creating the object. Notice the missing annotation:
public class MyClassUsingMyInterface {
private MyInterface mMyInterface;
public MyClassUsingMyInterface(MyInterface myInterface) {
mMyInterface = myInterface;
}
}
In that case we have to provide the class manually. The module would need to be modified like the following:
@Module
public class MyModule {
@Provides
MyInterface providesMyInterface() {
return new MyInterface() {
};
}
@Provides
MyClassUsingMyInterface providesMyClass(MyInterface myInterface) {
return new MyClassUsingMyInterface(myInterface);
}
}
This introduces more code for us to write, but it will make available those classes that you cannot modify.
Let's say I have the following classes and Dagger module:
public class Base implements IBase {
private IDependency dependency; //IDependency is an interface
Base(IDependency dependency) {
this.dependency = dependency;
}
}
public class SubClass extends Base implements ISubclass {
SubClass(IDependency dependency) {
super(dependency);
}
}
@Module
public class MyModule {
// Let's assume some other class use SubClass and requires this
@Provides
ISubclass providesSubclass(IDependency dependency) {
return new SubClass(dependency);
}
}
If I add a new parameter to the Base constructor, I'll have to go to MyModule and modify the provides method to include this new parameter (besides obviously changing the Base and SubClass constructors). It seems to me that with property injection I wouldn't have this problem, since I'm not using any constructor.
My feeling is that I might be doing something wrong or have some concept wrong. I prefer constructor injection over property injection, but right now I have to add a constructor parameter to a base class used by 40 other classes, and not only do I have to modify those 40 classes' constructors, I also have to modify the modules to reflect the new constructor parameters.
Am I missing something? Am I correct in saying that with property injection I'd write much less code and maintenance would be easier?
Yes, you are missing some awesome feature: You can still use constructor injection in this case! And you don't even have to write it yourself.
If all of the dependencies can be provided, dagger can and will create the object for you. Given that you can provide IDependency you just need to modify your code like the following:
public class SubClass extends Base implements ISubclass {
@Inject // Don't forget the annotation!
public SubClass(IDependency dependency) {
super(dependency);
}
}
@Module
public class MyModule {
@Provides
ISubclass providesSubclass(SubClass subclass) {
return subclass;
}
}
You provide the interface, yet you depend on your implementation to provide it. Dagger will resolve this, and you can merrily add as many parameters to the constructor as you like (apart from the obvious changes to the actual constructors you already pointed out).
Don't forget the @Inject annotation!
I have two independent projects, Basic and Extension, with the following setup:
Project A:
class Handler {
public void handle(){
...
}
}
Project B
import Handler; // from Project A
class SomeClass{
someMethod() {
handle(); // dependency on Project A's class with the handle method
}
}
So the problem is the dependency on the handle method, which exists in Project A but not at compile time in Project B.
The final step is to build Project Extension as a JAR and import it inside Project Basic.
Of course the compiler will give me an error when I build Project B, since handle is not known at compile time.
For this issue I need a solution:
Either: tell Java that the missing code (the imported class with the handle method) will be there at runtime.
Or: maybe dependency injection via a factory pattern.
I am familiar with the factory pattern, but I don't understand how it could help me in this situation.
Or another solution.
Can you help me?
Neither of these is valid Java - it won't compile. The proper keyword is "class", not "Class".
You have to provide it at compile time once you get it right - you have no choice. No way around it.
Maybe you should look at the Java JDK and follow the example of the java.sql package: interfaces. Connection, ResultSet, Statement, etc. are all interfaces, so vendors can provide their own implementations. Users only deal with the interfaces.
Your GenericHandler should be an interface that you provide to clients. They add their implementations and add their JAR file containing the custom implementation at runtime.
Basic interface that all extensions implement:
public interface GenericHandler {
void genericHandle();
}
Extension code:
import GenericHandler;
public class Extension implements GenericHandler {
public void genericHandle() {
// Do something useful here
}
}
The factory pattern works only if you provide a finite, closed set of implementations:
public class GenericHandlerFactory {
private static final GenericHandlerFactory instance = new GenericHandlerFactory();
private GenericHandlerFactory() {}
public static GenericHandlerFactory getInstance() { return instance; }
public GenericHandler createHandler(Class<? extends GenericHandler> genericHandlerClass) {
GenericHandler result = null;
// Code to create the GenericHandler you want.
return result;
}
}
If users can extend your interface without your knowledge then a factory can't work; you have to stick to the JDBC example.
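If you do go the JDBC-style route, one common way to discover implementations purely at runtime is the JDK's ServiceLoader (this is how JDBC 4 drivers are registered); a minimal sketch, assuming each provider JAR ships a META-INF/services file named after the fully qualified GenericHandler interface and listing its implementation class:
import java.util.ServiceLoader;

// Discovers GenericHandler implementations that provider JARs register
// via META-INF/services/<fully.qualified.GenericHandler>.
public class HandlerLoader {

    public static void runAll() {
        ServiceLoader<GenericHandler> loader = ServiceLoader.load(GenericHandler.class);
        for (GenericHandler handler : loader) {
            handler.genericHandle(); // implementation is supplied only at runtime
        }
    }
}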