I am currently converting my projects to Kotlin, and I have an app with a Room database written in Java.
My Entity in Java:
#Entity(tableName = "store")
#Fts4
public class Store {
#PrimaryKey
#ColumnInfo(name = "rowid")
private Long identification;
#NonNull
#ColumnInfo(name = "name")
private String name;
#ColumnInfo(name = "location")
private String location;
#ColumnInfo(name = "days_open")
private int daysOpen;
public Store(Long identification, #NonNull String name, String location, int daysOpen) {
this.identification = identification;
this.name = name;
this.location = location;
this.daysOpen = daysOpen
}
public Long getIdentification() {
return identification;
}
#NonNull
public String getName() {
return name;
}
public String getLocation() {
return location;
}
public int getDaysOpen() {
return daysOpen;
}
}
I converted it to Kotlin this way:
#Entity(tableName = "store")
#Fts4
data class Store(
#PrimaryKey #ColumnInfo(name = "rowid")
val identification: Long?,
#ColumnInfo(name = "name")
val name: String,
#ColumnInfo(name = "location")
val location: String?
#ColumnInfo(name = "days_open")
val daysOpen: Int?
)
Now I am getting this error:
java.lang.IllegalStateException: Room cannot verify the data integrity. Looks like you've changed schema but forgot to update the version number. You can simply fix this by increasing the version number.
Do we really need to do a migration for this? Or am I converting things wrongly? I am using Room 2.3.0:
implementation "androidx.room:room-ktx:2.3.0"
kapt "androidx.room:room-compiler:2.3.0"
When I updated the database version, this is the error
java.lang.IllegalStateException: A migration from 1 to 2 was required but not found. Please provide the necessary Migration path via RoomDatabase.Builder.addMigration(Migration ...) or allow for destructive migrations via one of the RoomDatabase.Builder.fallbackToDestructiveMigration* methods.
I added this code to my database
val MIGRATION_1_2 = object : Migration(1, 2) {
    override fun migrate(database: SupportSQLiteDatabase) {
        // put changes here
        database.execSQL("")
    }
}
I don't know what to put inside the migrate function. Any ideas?
The exception message seems to be quite clear: you need to update the version of your Room database.
Go to the class that extends RoomDatabase and increment the value of the version attribute in the @Database annotation.
@Database(entities = [A::class, B::class], version = 2)
abstract class YourRoomDatabase : RoomDatabase()
I already got the solution.
The problem is that the int in my Java entity is different from the Int? in Kotlin: the Java int produces a NOT NULL column, while the nullable Kotlin Int? does not, so the schema changes and a migration is needed. My solution is referenced
here.
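For reference, a migration for a schema change on an @Fts4 entity typically drops and recreates the virtual table. This is only a sketch; the real CREATE statement should be copied from the createSql entry of the exported schema JSON, so the column list below is an assumption:

import androidx.room.migration.Migration
import androidx.sqlite.db.SupportSQLiteDatabase

val MIGRATION_1_2 = object : Migration(1, 2) {
    override fun migrate(database: SupportSQLiteDatabase) {
        // Drop and recreate the FTS table so it matches the new entity.
        // Copy the real statement from the exported schema JSON; this one is
        // illustrative, and existing rows are lost unless they are copied to
        // a temporary table first.
        database.execSQL("DROP TABLE IF EXISTS `store`")
        database.execSQL("CREATE VIRTUAL TABLE IF NOT EXISTS `store` USING FTS4(`name`, `location`, `days_open`)")
    }
}

The migration then has to be registered on the database builder with .addMigrations(MIGRATION_1_2).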
What I have set up are two tables: one for a user-created account, and another that lets the user buy a product.
I have both tables set up like so:
Customer Table
@PrimaryKey(autoGenerate = true)
private int custId;

@ColumnInfo(name = "user_name")
private String userName;

@ColumnInfo(name = "password")
private String password;

@ColumnInfo(name = "first_name")
private String firstName;

@ColumnInfo(name = "last_name")
private String lastName;

@ColumnInfo(name = "address")
private String address;

@ColumnInfo(name = "city")
private String city;

@ColumnInfo(name = "postal_code")
private String postalCode;

@ColumnInfo(name = "country")
private String country;
Phone Table
@PrimaryKey(autoGenerate = true)
private int productId;

private String phoneMake;
private String phoneModel;
private String phoneColor;
private String storageCapacity;
private Float price;
What I want to set up are two foreign keys, one in each table. My last table is for ordering the phones, which requires using both primary keys from each table. What I feel like I need is a ForeignKey, similar in vein to the PrimaryKey already created. The problem is that I am unsure how to implement that in the program; everything I try is not working. I have looked at the documentation, but nothing clicks. I hope you can help me get this set up correctly. If more is needed, let me know (this code is written in Java).
If you simply want a Customer to have one phone, then you can have a single column (member variable) for the relationship that will store the phone's product id.
e.g.
private int mapToPhone; // <<<<< ADDED: no need for @ColumnInfo, the column name will be the variable name
Obviously you set the value to an appropriate value.
To then get the Customer with the phone's details, you have a POJO that embeds the parent (Customer) using the @Embedded annotation and has the child (Phone) annotated with the @Relation annotation.
e.g. :-
class CustomerWithPhoneDetails {
    @Embedded
    Customer customer;

    @Relation(
            entity = Phone.class,
            parentColumn = "mapToPhone",
            entityColumn = "productId"
    )
    Phone phoneDetails;
}
You can then have a method in the @Dao annotated interface/abstract class which queries the parent table BUT returns the POJO or a list/array of the POJO, e.g. :-
#Query("SELECT * FROM Customer")
abstract List<CustomerWithPhoneDetails> getAllCustomersWithPhoneDeytails();
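The rest of the DAO is not shown in the answer; a minimal AllDao combining that query with the insert methods used in the example below might look like this (a sketch, the insert signatures are assumptions):

import java.util.List;

import androidx.room.Dao;
import androidx.room.Insert;
import androidx.room.Query;
import androidx.room.Transaction;

@Dao
public abstract class AllDao {

    @Insert
    public abstract long insert(Phone phone);

    @Insert
    public abstract long insert(Customer customer);

    // @Transaction ensures the @Relation child is loaded in the same transaction.
    @Transaction
    @Query("SELECT * FROM Customer")
    public abstract List<CustomerWithPhoneDetails> getAllCustomersWithPhoneDetails();
}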
Example
Based upon your code and the additional example code, along with an @Database annotated abstract class :-
@Database(entities = {Customer.class, Phone.class}, version = 1, exportSchema = false)
abstract class TheDatabase extends RoomDatabase {
    abstract AllDao getAllDao();

    private static volatile TheDatabase instance = null;

    public static TheDatabase getInstance(Context context) {
        if (instance == null) {
            instance = Room.databaseBuilder(context, TheDatabase.class, "the_database.db")
                    .allowMainThreadQueries()
                    .build();
        }
        return instance;
    }
}
and an Activity e.g. :-
public class MainActivity extends AppCompatActivity {

    TheDatabase db;
    AllDao dao;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        db = TheDatabase.getInstance(this);
        dao = db.getAllDao();

        long phone01ProductId = dao.insert(new Phone("PhoneMaker001", "Model001", "Color001", "100Mb", 111.11F));
        long phone02ProductId = dao.insert(new Phone("PhoneMaker002", "Model002", "Color002", "200Mb", 222.22F));

        dao.insert(new Customer("c001", "password001", "firstname001", "lastname001", "address001", "city001", "country001", "postcode001", (int) phone01ProductId));
        dao.insert(new Customer("c002", "password002", "firstname002", "lastname002", "address002", "city002", "country002", "postcode002", (int) phone02ProductId));

        for (CustomerWithPhoneDetails cwpd : dao.getAllCustomersWithPhoneDetails()) {
            Log.d("DBINFO", "Customer is " + cwpd.customer.getUserName() + " etc. Phone is " + cwpd.phoneDetails.getProductId() + " etc.");
        }
    }
}
Note that suitable constructors have been coded in both the Phone and Customer classes (a default/empty constructor and one, annotated with the @Ignore annotation, that allows all values bar the id to be passed, as used in the example above).
Note that ideally long rather than int should be used for the id columns.
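For reference, the kind of constructor pair described above might look like this for Phone (a sketch based on the fields shown in the question; getters and setters are omitted for brevity):

import androidx.room.Entity;
import androidx.room.Ignore;
import androidx.room.PrimaryKey;

@Entity
public class Phone {

    @PrimaryKey(autoGenerate = true)
    private int productId;

    private String phoneMake;
    private String phoneModel;
    private String phoneColor;
    private String storageCapacity;
    private Float price;

    // Empty constructor used by Room when reading rows back.
    public Phone() {}

    // Convenience constructor for new rows; @Ignore tells Room not to use it.
    @Ignore
    public Phone(String phoneMake, String phoneModel, String phoneColor,
                 String storageCapacity, Float price) {
        this.phoneMake = phoneMake;
        this.phoneModel = phoneModel;
        this.phoneColor = phoneColor;
        this.storageCapacity = storageCapacity;
        this.price = price;
    }

    // Getters and setters for each field omitted for brevity.
}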
Results
The Log :-
D/DBINFO: Customer is c001 etc. Phone is 1 etc.
D/DBINFO: Customer is c002 etc. Phone is 2 etc.
App Inspection :- (screenshots of the resulting Customer and Phone tables, not reproduced here)
I am trying to integrate OptaPlanner in my project. I am working with Spring JPA, Maven and a MySQL database.
I have added the dependencies to my Maven file, so I can use the OptaPlanner annotations, but I don't know how to use them. I have been reading the documentation and examples, but I still don't know how to use it.
I have to assign recipes and a user to a class called FoodList. Each object of FoodList has an id, 2 enums, the recipe, the user and a Date, as shown below:
FoodList class:
@PlanningEntity
@Entity
public class ListaComida {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Enumerated(EnumType.STRING)
    private Comida comida;

    @Enumerated(EnumType.STRING)
    private Plato plato;

    @PlanningVariable
    @ManyToOne
    private Receta receta;

    @PlanningVariable
    @ManyToOne
    private Usuario usuario;

    @Column(nullable = false)
    private LocalDate fecha;

    ...
}
@PlanningSolution // OptaPlanner annotation
@TypeDef(defaultForType = HardSoftScore.class, typeClass = HardSoftScoreHibernateType.class) // Hibernate annotation
public class ListaComidaSolution {

    @Columns(columns = {@Column(name = "hardScore"), @Column(name = "softScore")})
    private HardSoftScore score;

    @PlanningScore
    public HardSoftScore getScore() {
        return score;
    }

    public void setScore(HardSoftScore score) {
        this.score = score;
    }
}
<!-- Score configuration -->
<scoreDirectorFactory>
    <easyScoreCalculatorClass>src/main/java/es.uca.AutomaticFoodList/GenerarComidaEasyScoreCalculator</easyScoreCalculatorClass>
    <!--<scoreDrl>org/optaplanner/examples/cloudbalancing/solver/cloudBalancingScoreRules.drl</scoreDrl>-->
</scoreDirectorFactory>

<!-- Optimization algorithms configuration -->
<termination>
    <secondsSpentLimit>30</secondsSpentLimit>
</termination>
public class GenerarComidaEasyScoreCalculator implements EasyScoreCalculator<ListaComidaSolution> {

    public HardSoftScore calculateScore(ListaComidaSolution listaComidaSolution) {
        int hardScore = 0, softScore = 0;
        return HardSoftScore.of(hardScore, softScore);
    }
}
This class is not implemented, but I think I have to do it.
public static void generarListaComida() {
    //SolverFactory<CloudBalance> solverFactory = SolverFactory.createFromXmlResource(
    //        "org/optaplanner/examples/cloudbalancing/solver/cloudBalancingSolverConfig.xml");
    //Solver<CloudBalance> solver = solverFactory.buildSolver();

    // Load a problem with 400 computers and 1200 processes
    //CloudBalance unsolvedCloudBalance = new CloudBalancingGenerator().createCloudBalance(400, 1200);

    // Solve the problem
    //CloudBalance solvedCloudBalance = solver.solve(unsolvedCloudBalance);

    // Display the result
    //System.out.println("\nSolved cloudBalance with 400 computers and 1200 processes:\n"
    //        + toDisplayString(solvedCloudBalance));
}
Are these all the classes and files I need to implement this in my project, or do I have to implement more?
On https://www.optaplanner.org/ you can download an executable demo. However, it is not just an executable demo; it also contains the source code of the examples (in the examples/source folder). There you can see how OptaPlanner is used in the example applications, and you can do the same in your application.
A good starting point is also https://docs.optaplanner.org/7.36.0.Final/optaplanner-docs/html_single/index.html#plannerConfiguration, chapter 4 ff.
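As a rough illustration of the wiring (not from the original answer; the config resource name and wrapper class are assumptions), solving with the XML configuration shown in the question would look much like the commented-out cloud-balancing code:

import org.optaplanner.core.api.solver.Solver;
import org.optaplanner.core.api.solver.SolverFactory;

public class ListaComidaSolverRunner {

    // Builds a Solver from an XML config on the classpath (name assumed)
    // and solves an initialized ListaComidaSolution.
    public static ListaComidaSolution solve(ListaComidaSolution unsolvedListaComida) {
        SolverFactory<ListaComidaSolution> solverFactory =
                SolverFactory.createFromXmlResource("listaComidaSolverConfig.xml");
        Solver<ListaComidaSolution> solver = solverFactory.buildSolver();
        return solver.solve(unsolvedListaComida);
    }
}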
I'm using Google Gson to de/serialize my app database to/from JSON. The app uses a Room database with a custom class that has more than 20 fields, so for simplicity let's consider an example one. I recently changed the type of a field from String to JSONObject; Room managed the migration fine, just by creating a converter class and adding an empty migration for the version bump, but I'm not sure how to correctly handle Gson.
Here is the object class, respectively the old and the new one:
Old exClass.java
#Entity(tableName = "example_class")
public class exClass {
#PrimaryKey(autoGenerate = true)
public int uid;
#ColumnInfo(name = "name")
public String name;
#ColumnInfo(name = "json")
public String json;
}
New exClass.java
#Entity(tableName = "example_class")
#TypeConverters({Converters.class})
public class exClass {
#PrimaryKey(autoGenerate = true)
public int uid;
#ColumnInfo(name = "name")
public String name;
#ColumnInfo(name = "json")
public JSONObject json;
}
Now, when trying to restore old database backups (created with the old version of the class), Gson throws an exception, because it expects a JSON object for the field "json" but finds a String.
Is there a way to override the type Gson expects without having to create a custom JsonSerializer<exClass>?
Note that the JSONObject is still fine as far as Room is concerned: the converter converts "json" to a String before inserting it into the database and vice versa, so it's only a matter of having Gson correctly parse the "json" field.
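For context, the Room converter mentioned above might look roughly like this (a sketch; the class and method names are assumptions):

import androidx.room.TypeConverter;

import org.json.JSONException;
import org.json.JSONObject;

public class Converters {

    // Stores the JSONObject as its String representation in the database.
    @TypeConverter
    public static String fromJSONObject(JSONObject json) {
        return json == null ? null : json.toString();
    }

    // Parses the stored String back into a JSONObject when reading.
    @TypeConverter
    public static JSONObject toJSONObject(String value) {
        if (value == null) {
            return null;
        }
        try {
            return new JSONObject(value);
        } catch (JSONException e) {
            return null;
        }
    }
}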
Thanks in advance
I try to save "BestandItem", which is made out of "ScanItem" and a Calendar with Ormlite. After I restart my app (for Android with Android Studio, which I write with Java) the content of ScanItem is gone, but only if it is inside of a Bestanditem.
This is my ScanItem:
@DatabaseTable(tableName = "scanItem")
public class ScanItem {

    @DatabaseField(generatedId = true)
    private int id;

    @DatabaseField
    private String barcode;

    @DatabaseField
    private String name;

    public ScanItem(String barcode, String name) {
        this.barcode = barcode;
        this.name = name;
    }

    public ScanItem() {}
}
And this is my BestandItem:
@DatabaseTable(tableName = "bestandItem")
public class BestandItem {

    @DatabaseField(generatedId = true)
    private int id;

    @DatabaseField(foreign = true, foreignAutoCreate = true)
    private ScanItem scanItem;

    @DatabaseField(dataType = DataType.SERIALIZABLE)
    private Calendar ablaufDatum;

    public BestandItem() {}

    public BestandItem(ScanItem scanItem, Calendar ablaufDatum) {
        this.scanItem = scanItem;
        this.ablaufDatum = ablaufDatum;
    }
}
Some of the things I have tried:
- Ormlite Documentation
- First Stackoverflow Answer
- Second Stackoverflow Answer
For more code see my github project: Github SmartFridge
My ORMLite database has a UtilConfigClass and an always-updated config.txt.
What did I do wrong here? Why doesn't it save the ScanItem correctly?
After some checks, I can say that the other methods work just fine (only after the ScanItem is lost do I get a NullPointerException). My conclusion is that the problem is the constructor of BestandItem.
I think I did something wrong with generatedId and/or foreignAutoCreate, but I don't really understand how to use them properly.
I also don't understand what exactly foreignAutoRefresh does.
I have tried swapping the id and generatedId around, because I think that is where the problem lies.
After some tests and countless tries, I now have a solution.
Because the barcode inside ScanItem is unique, I could use it as the ID instead of a generated id.
With an existing ID, the program now works.
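A sketch of what that change might look like (the foreignAutoRefresh flag is an extra suggestion, not part of the original answer; it makes ORMLite load the full ScanItem, rather than just its id, when a BestandItem is queried):

import com.j256.ormlite.field.DataType;
import com.j256.ormlite.field.DatabaseField;
import com.j256.ormlite.table.DatabaseTable;

import java.util.Calendar;

@DatabaseTable(tableName = "scanItem")
public class ScanItem {

    // The barcode is unique, so it is used as the id instead of a generated one.
    @DatabaseField(id = true)
    private String barcode;

    @DatabaseField
    private String name;

    public ScanItem() {}
}

@DatabaseTable(tableName = "bestandItem")
class BestandItem {

    @DatabaseField(generatedId = true)
    private int id;

    // foreignAutoRefresh = true fills in the referenced ScanItem's fields
    // when a BestandItem is read back from the database.
    @DatabaseField(foreign = true, foreignAutoCreate = true, foreignAutoRefresh = true)
    private ScanItem scanItem;

    @DatabaseField(dataType = DataType.SERIALIZABLE)
    private Calendar ablaufDatum;

    public BestandItem() {}
}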
I have a Flink project that will be inserting data into a Cassandra table as a batch job. I already have a Flink streaming project where it writes a POJO to the same Cassandra table, but CassandraOutputFormat needs the data as a Tuple (I hope that is changed to accept POJOs like CassandraSink does at some point). So here is the POJO that I have:
@Table(keyspace = "mykeyspace", name = "mytablename")
public class AlphaGroupingObject implements Serializable {

    @Column(name = "jobId")
    private String jobId;

    @Column(name = "datalist")
    @Frozen("list<frozen<dataobj>")
    private List<CustomDataObj> dataobjs;

    @Column(name = "userid")
    private String userid;

    //Getters and Setters
}
And the DataSet of Tuples I am making from this POJO:
DataSet<Tuple3<String, List<CustomDataObj>, String>> outputDataSet = listOfAlphaGroupingObject.map(new AlphaGroupingObjectToTuple3Mapper());
And here is the line that triggers the output as well:
outputDataSet.output(new CassandraOutputFormat<>("INSERT INTO mykeyspace.mytablename (jobid, datalist, userid) VALUES (?,?,?);", clusterThatWasBuilt));
Now the issue that I have is when I try to run this, I get this error when it tries to output it to the cassandra table:
Caused by: com.datastax.driver.core.exceptions.CodecNotFoundException:
Codec not found for requested operation: [frozen<mykeyspace.dataobj> <-> flink.custom.data.CustomDataObj]
So I know that when it was a POJO, I just had to add the @Frozen annotation to the field, but I don't know how to do that for a tuple. What is the best/proper way to fix this? Or am I doing something unnecessary because there is actually a way to send POJOs through the CassandraOutputFormat that I just haven't found?
Thanks for any and all help in advance!
EDIT:
Here is the code for the CustomDataObj class too:
#UDT(name="dataobj", keyspace = "mykeyspace")
public class CustomDataObj implements Serializable {
#Field(name = "userid")
private String userId;
#Field(name = "groupid")
private String groupId;
#Field(name = "valuetext")
private String valueText;
#Field(name = "comments")
private String comments;
//Getters and setters
}
EDIT 2:
Here is the Cassandra schema for the type that CustomDataObj is tied to, along with the mytablename schema.
CREATE TYPE mykeyspace.dataobj (
    userid text,
    groupid text,
    valuetext text,
    comments text
);

CREATE TABLE mykeyspace.mytablename (
    jobid text,
    datalist list<frozen<dataobj>>,
    userid text,
    PRIMARY KEY (jobid, userid)
);
Add the @UDT annotation to the CustomDataObj class:
@UDT(name = "dataobj")
public class CustomDataObj {
    //......
}
Edited
Change the jobId column annotation to @Column(name = "jobid") and the dataobjs frozen annotation to a plain @Frozen:
@Table(keyspace = "mykeyspace", name = "mytablename")
public class AlphaGroupingObject implements Serializable {

    @Column(name = "jobid")
    private String jobId;

    @Column(name = "datalist")
    @Frozen
    private List<CustomDataObj> dataobjs;

    @Column(name = "userid")
    private String userid;

    //Getters and Setters
}
I believe I have found a better way than having to provide a tuple to the CassandraOutputFormat, but it technically still doesn't answer this question, so I won't mark this as the answer. I ended up using Cassandra's object mapper, so I can just send the POJO to the table. I still need to validate that the data reaches the table successfully and that everything works properly with this implementation, but I felt this would help anyone facing a similar problem.
Here is the doc that outlines the solution: http://docs.datastax.com/en/developer/java-driver/2.1/manual/object_mapper/using/
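As a rough sketch of that approach (the contact point and wrapper class here are illustrative, not from the original post), the DataStax object mapper persists the annotated POJO directly, so no Tuple conversion is needed:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;
import com.datastax.driver.mapping.Mapper;
import com.datastax.driver.mapping.MappingManager;

public class MapperExample {

    // Saves one AlphaGroupingObject via the object mapper, using the
    // @Table/@Column/@UDT annotations already on the class.
    public static void save(AlphaGroupingObject obj) {
        try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
             Session session = cluster.connect()) {
            MappingManager manager = new MappingManager(session);
            Mapper<AlphaGroupingObject> mapper = manager.mapper(AlphaGroupingObject.class);
            mapper.save(obj);
        }
    }
}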