I'm creating a mobile application that will have users as defined below:
public class UserObject extends SugarRecord<UserObject>
{
public UUID Id;
public String Name;
public int DefaultSort;
public String Username;
public Date LastLogin;
public UserObject()
{
}
public UserObject(UUID id, String name, int defaultSort, String username, String password, Date lastLogin)
{
this.Id = id;
this.Name = name;
this.DefaultSort = defaultSort;
this.Username = username;
this.LastLogin = lastLogin;
}
}
These users will be retrieved from an API - the IDs are stored in the Database as uniqueidentifiers (or GUIDs in C#).
When I enter a new record into the users table, everything works just fine:
DateTime dt = new DateTime();
UUID newUID = UUID.randomUUID();
UserObject newUser = new UserObject(newUID, "David", 1, "userOne", "password", dt.toDate());
newUser.save();
However, when I try and retrieve the value back out I get an error:
Class cannot be read from Sqlite3 database. Please check the type of field Id(java.util.UUID)
Does Sugar ORM (or SQLite) simply not support UUIDs? If I try a Joda DateTime in place of the Date for LastLogin, the table won't build at all, so it looks as if Sugar can create the fields, just not store/retrieve them...
You should not use a field named Id. SugarRecord uses Id to store its own index values. Use anything except Id:
public class UserObject extends SugarRecord{
public UUID uId;
public String Name;
public int DefaultSort;
public String Username;
public Date LastLogin;
public UserObject()
{
}
public UserObject(UUID uId, String name, int defaultSort, String username, String password, Date lastLogin)
{
this.uId = uId;
this.Name = name;
this.DefaultSort = defaultSort;
this.Username = username;
this.LastLogin = lastLogin;
}
}
As of v1.4 it is no longer necessary to extend SugarRecord.
Sugar ORM won't be able to rebuild the date object when retrieving data from the database. If you take a look at your database, it's probably storing strings (that's the case for me, and I'm using Joda).
In my case I got around this by using Strings instead of date objects. You have to parse them after you retrieve the data from the database, but I believe that's the only option you have, since SQLite only supports a few data types.
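A minimal sketch of that workaround, assuming you keep Sugar's own primary key and store both the UUID and the timestamp as Strings (the helper getters here are illustrative, not Sugar API):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.UUID;

public class UserObject extends SugarRecord {
    public String UId;       // UUID stored in its canonical string form
    public String Name;
    public int DefaultSort;
    public String Username;
    public String LastLogin; // stored as "yyyy-MM-dd HH:mm:ss"

    private static final String PATTERN = "yyyy-MM-dd HH:mm:ss";

    public UserObject() {
    }

    public UserObject(UUID uId, String name, int defaultSort, String username, Date lastLogin) {
        this.UId = uId.toString();
        this.Name = name;
        this.DefaultSort = defaultSort;
        this.Username = username;
        this.LastLogin = new SimpleDateFormat(PATTERN, Locale.US).format(lastLogin);
    }

    // Parse the stored strings back after loading.
    public UUID uidAsUuid() {
        return UUID.fromString(UId);
    }

    public Date lastLoginAsDate() throws ParseException {
        return new SimpleDateFormat(PATTERN, Locale.US).parse(LastLogin);
    }
}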
I am trying to execute a custom select query in Spring Boot JPA:
public interface idnOauth2AccessTokenRepository extends JpaRepository<idnOauth2AccessToken, String>,
JpaSpecificationExecutor<idnOauth2AccessToken> {
@Query(value = "select IOCA.userName, IOCA.appName, IOAT.refreshToken, IOAT.timeCreated, IOAT.tokenScopeHash, IOAT.tokenState, IOAT.validityPeriod from idnOauth2AccessToken IOAT inner join idnOauthConsumerApps IOCA on IOCA.ID = IOAT.consumerKeyID where IOAT.tokenState='ACTIVE'")
List<userApplicationModel> getUserApplicationModel();
}
But when I execute I get an error of
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [org.springframework.data.jpa.repository.query.AbstractJpaQuery$TupleConverter$TupleBackedMap] to type [com.adl.egw.Model.user.userApplicationModel]
I tried different answers from the internet, but nothing seems to work. I also tried implementing a new repository for userApplicationModel, but that didn't work either.
Any answer or implementation that could help would be appreciated.
You are joining columns from different tables and then assigning the result to a different object. It does not work this way, and userApplicationModel does not appear to be a managed entity. For such scenarios you have to use a projection (DTO mapping). Take a look at the following query:
@Query(value = "select new your.package.UserApplicationModelProjection(IOCA.userName, IOCA.appName, IOAT.refreshToken, IOAT.timeCreated, IOAT.tokenScopeHash, IOAT.tokenState, IOAT.validityPeriod)"
+ " from idnOauth2AccessToken IOAT inner join idnOauthConsumerApps IOCA on IOCA.ID = IOAT.consumerKeyID where IOAT.tokenState='ACTIVE'")
List<UserApplicationModelProjection> getUserApplicationModel();
And the class to map to:
public class UserApplicationModelProjection {
private String userName;
private String appName;
private String refreshToken;
private OffsetDateTime timeCreated;
private String tokenScopeHash;
private String tokenState; //mind the data type
private int validityPeriod; //update the data type
public UserApplicationModelProjection(String userName,
String appName,
String refreshToken,
OffsetDateTime timeCreated,
String tokenScopeHash,
String tokenState,
int validityPeriod)
{
this.userName = userName;
this.appName = appName;
this.refreshToken = refreshToken;
this.timeCreated = timeCreated;
this.tokenScopeHash = tokenScopeHash;
this.tokenState = tokenState;
this.validityPeriod = validityPeriod;
}
// Getters only
}
Check this for detailed explanation: https://vladmihalcea.com/the-best-way-to-map-a-projection-query-to-a-dto-with-jpa-and-hibernate/
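As an alternative sketch (not from the article above, just standard Spring Data behaviour as far as I know): you can also map the result to an interface-based projection, provided every selected column is aliased to match a getter name, which avoids writing the constructor expression:

import java.time.OffsetDateTime;
import java.util.List;
import org.springframework.data.jpa.repository.Query;

// Closed projection: Spring Data backs this interface with the query tuple.
public interface UserApplicationView {
    String getUserName();
    String getAppName();
    String getRefreshToken();
    OffsetDateTime getTimeCreated();
    String getTokenScopeHash();
    String getTokenState();
    Integer getValidityPeriod();
}

// In the repository interface: the aliases must match the getter names above.
@Query("select IOCA.userName as userName, IOCA.appName as appName, IOAT.refreshToken as refreshToken,"
     + " IOAT.timeCreated as timeCreated, IOAT.tokenScopeHash as tokenScopeHash,"
     + " IOAT.tokenState as tokenState, IOAT.validityPeriod as validityPeriod"
     + " from idnOauth2AccessToken IOAT inner join idnOauthConsumerApps IOCA on IOCA.ID = IOAT.consumerKeyID"
     + " where IOAT.tokenState = 'ACTIVE'")
List<UserApplicationView> getUserApplicationView();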
I am trying to execute this query:
@Override
public UserInfo get(Long id) {
String sql = "SELECT * FROM users WHERE id = ? ";
List<UserInfo> list = jdbcTemplate.query(sql,new UserInfoMapper(),id);
return list.get(0);
}
but JdbcTemplate returns an empty list and I get an exception at the return line.
However, if I execute the same query directly through the database console with id 1, it returns the correct answer, while in the method the list comes back empty.
I couldn't find any similar questions, which may just point at some inattention on my part, but I can't see any problem that could cause this. Thanks in advance.
Update 1
Changing the code to
@Override
public UserInfo get(Long id) {
String sql = "SELECT * FROM users WHERE id = ? ";
List<UserInfo> list = jdbcTemplate.query(sql, new Object[] {id},new UserInfoMapper());
return list.get(0);
}
resulted in the same empty list.
Update 2
@Override
public UserInfo mapRow(ResultSet resultSet, int i) throws SQLException {
UserInfo info = new UserInfo();
info.setId(resultSet.getLong("id"));
info.setFirstname(resultSet.getString("firstname"));
info.setMiddlename(resultSet.getString("middlename"));
info.setLastname(resultSet.getString("lastname"));
info.setUsername(resultSet.getString("username"));
info.setPassword(resultSet.getString("password"));
info.setEmail(resultSet.getString("email"));
info.setMobilephone(resultSet.getString("mobilephone"));
info.setPosition(resultSet.getString("position"));
return info;
}
public class UserInfo {
private Long id;
private String firstname;
private String middlename;
private String lastname;
private String username;
private String password;
private String email;
private String mobilephone;
private String position;
public UserInfo() {
}
}
Getters and setters for each field are there, but I think there is no need to show them.
Check that the user credentials your application uses to connect to the database are the same as the credentials you use in the console, and also check the owner/schema of the table your application is querying.
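Independent of the root cause, two hedged checks may help narrow it down: print which database the application is actually connected to, and make the empty-list case explicit instead of failing with an IndexOutOfBoundsException on list.get(0). A rough sketch (needs java.sql.Connection/SQLException and org.springframework.dao.EmptyResultDataAccessException imports):

@Override
public UserInfo get(Long id) {
    try (Connection conn = jdbcTemplate.getDataSource().getConnection()) {
        // Confirm the application and the console really point at the same database.
        System.out.println("Connected to: " + conn.getMetaData().getURL()
                + " as " + conn.getMetaData().getUserName());
    } catch (SQLException e) {
        e.printStackTrace();
    }

    String sql = "SELECT * FROM users WHERE id = ?";
    List<UserInfo> list = jdbcTemplate.query(sql, new UserInfoMapper(), id);
    if (list.isEmpty()) {
        // Fail with a clear message instead of an IndexOutOfBoundsException.
        throw new EmptyResultDataAccessException("No user found with id " + id, 1);
    }
    return list.get(0);
}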
I have a field which represents an IP address in my schema. I want to use the BinaryType to store the data.
The way I imagine this: if my IP is 50.100.150.200, I will save it as [50, 100, 150, 200] in a byte array (the byte order certainly matters, but we can leave that out of this question).
My question is how to filter by this column when I query (a String doesn't really fit the purpose).
For instance I want to run the following query:
SELECT * from table1 WHERE sourceip='50.100.150.200'
Here is a piece of code to demonstrate the problem:
Bean definition (for schema creation):
public static class MyBean1 implements Serializable {
private static final long serialVersionUID = 1L;
private int id;
private String name;
private byte[] description;
public MyBean1(int id, String name, String description) {
this.id = id;
this.name = name;
this.description = description.getBytes();
}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public byte[] getDescription() {
return description;
}
public void setDescription(byte[] description) {
this.description = description;
}
}
Demo code (I want to filter by the description):
List<MyBean1> newDebugData = new ArrayList<MyBean1>();
newDebugData.add(new MyBean1(1, "Arnold", "10.150.15.10"));
newDebugData.add(new MyBean1(1, "Bob", "10.150.15.11"));
newDebugData.add(new MyBean1(3, "Bob", "10.150.15.12"));
newDebugData.add(new MyBean1(3, "Bob", "10.150.15.13"));
newDebugData.add(new MyBean1(1, "Alice", "10.150.15.14"));
Dataset<Row> df2 = sqlContext.createDataFrame(newDebugData, MyBean1.class);
df2.createTempView("table1");
sqlContext.sql("select * from table1 where description='10.150.15.14'").show();
I am getting the error:
differing types in '(table1.`description` = CAST('10.150.15.14' AS DOUBLE))'
This isn't 100% an answer to your question, but I hope the pointer will help.
The following question is not about filtering but about selecting data from an array: selecting a range of elements in an array spark sql
It looks like a lot of information there, including some guidance on UDFs to query arrays using Spark SQL.
Hope this helps.
SPARK-21344 fixed the following issue in versions 2.0.3, 2.1.2, 2.2.1: BinaryType comparison does signed byte array comparison. So binary comparisons should work with those releases.
The JIRA has the following scala test code:
case class TestRecord(col0: Array[Byte])
def convertToBytes(i: Long): Array[Byte] = {
val bb = java.nio.ByteBuffer.allocate(8)
bb.putLong(i)
bb.array
}
val timestamp = 1498772083037L
val data = (timestamp to timestamp + 1000L).map(i => TestRecord(convertToBytes(i)))
val testDF = sc.parallelize(data).toDF
val filter1 = testDF.filter(col("col0") >= convertToBytes(timestamp)
&& col("col0") < convertToBytes(timestamp + 50L))
assert(filter1.count == 50)
I don't know what the equivalent Java code would be, but that should get you started.
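A hedged Java sketch of the same idea, applied to the df2 from the question (assumes a Spark release that contains the SPARK-21344 fix; Column.equalTo/geq/lt wrap a byte[] argument in a binary literal rather than a string):

import static org.apache.spark.sql.functions.col;

import java.nio.charset.StandardCharsets;

// Equality filter using a byte[] literal instead of a String
// (the String was being CAST to DOUBLE in the error above).
byte[] wanted = "10.150.15.14".getBytes(StandardCharsets.UTF_8);
df2.filter(col("description").equalTo(wanted)).show();

// Range comparison, mirroring the Scala test from the JIRA.
df2.filter(col("description").geq("10.150.15.10".getBytes(StandardCharsets.UTF_8))
        .and(col("description").lt("10.150.15.13".getBytes(StandardCharsets.UTF_8))))
   .show();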
I mentioned in a comment above on the question that we use LongType to store IPv4 addresses. We have a wrapper script to convert dotted decimal to long integer, and a Spark UDF to go the other way: long integer to dotted decimal. I assume LongType is faster for queries than BinaryType.
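For reference, a rough sketch of that LongType approach; the helper names ipToLong/longToIp are illustrative, not the actual wrapper script or UDF mentioned above:

import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DataTypes;

// Dotted decimal -> long, used when writing rows with a LongType column.
public static long ipToLong(String dotted) {
    long result = 0L;
    for (String octet : dotted.split("\\.")) {
        result = (result << 8) | Integer.parseInt(octet);
    }
    return result;
}

// UDF going the other way: long -> dotted decimal, for queries and display.
sqlContext.udf().register("longToIp", (UDF1<Long, String>) ip ->
        ((ip >> 24) & 0xFF) + "." + ((ip >> 16) & 0xFF) + "."
                + ((ip >> 8) & 0xFF) + "." + (ip & 0xFF),
        DataTypes.StringType);

// Example: sqlContext.sql("select longToIp(sourceip) from table1 where sourceip = " + ipToLong("50.100.150.200"));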
I am trying to create an application using Spring Boot (Java) with an HTML/CSS/JavaScript front end. I get the error below while doing a POST to create an employee record. I am passing a join date, which is causing the error.
failed - {"readyState":4,
"responseText":"{\"timestamp\":1515066928232,
\"status\":500,
\"error\":\"Internal Server Error\",
\"exception\":\"java.time.format.DateTimeParseException\",
\"message\":\"Text '01-17-2018' could not be parsed at index 0\",
\"path\":\"/employee/\"}",
"responseJSON":{"timestamp":1515066928232,
"status":500,
"error":"Internal Server Error",
"exception":"java.time.format.DateTimeParseException",
"message":"Text '01-17-2018' could not be parsed at index 0",
"path":"/employee/"},
"status":500,
"statusText":"error"}
Below is the code I use in Java:
private static final long serialVersionUID = -7314008048174880868L;
private static final DateTimeFormatter DATE_FORMAT = DateTimeFormatter.ofPattern("MM-dd-yyyy");
private Integer organisation;
private String empName;
private String joinDate;
private String gender;
private String designation;
private String email;
public EmployeeDto(){
}
public EmployeeDto(Integer organisation, String empName, String joinDate, String gender,
String designation, String email) {
super();
this.organisation = organisation;
this.empName = empName;
this.joinDate = joinDate;
this.gender = gender;
this.designation = designation;
this.email = email;
}
public Employee buildEmployee(){
return new Employee(this.empName,LocalDate.parse(this.joinDate),this.gender,this.designation,this.email);
}
I use the below date converter:
@Converter(autoApply = true)
public class LocalDateAttributeConverter implements AttributeConverter<LocalDate, Date> {
@Override
public Date convertToDatabaseColumn(LocalDate locDate) {
return (locDate == null ? null : Date.valueOf(locDate));
}
@Override
public LocalDate convertToEntityAttribute(Date sqlDate) {
return (sqlDate == null ? null : sqlDate.toLocalDate());
}
}
I am unable to find the root cause....
In buildEmployee() I think you want LocalDate.parse(this.joinDate, DATE_FORMAT) instead of LocalDate.parse(this.joinDate).
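In other words, a minimal sketch of the corrected method, reusing the formatter already declared in the DTO:

public Employee buildEmployee() {
    // "01-17-2018" matches MM-dd-yyyy, not the ISO yyyy-MM-dd that the one-argument parse() expects.
    return new Employee(this.empName,
            LocalDate.parse(this.joinDate, DATE_FORMAT),
            this.gender, this.designation, this.email);
}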
One often posted link here on Stack Overflow is How to debug small programs. In that blog post the first tip is to turn on compiler warnings and read them carefully. Putting your code into my Eclipse produces the following warning:
The value of the field EmployeeDto.DATE_FORMAT is not used
This may cause you to wonder why DATE_FORMAT is not used and why you wanted to have it there in the first place, which in turn could inspire you to use it as intended and fix your error.
I want to use Objectify to query Google Cloud Datastore. What is an appropriate way to find a record based on a known key-value pair? The record is in the database, I verified this by Google's Datastore viewer.
Here is my method stub, which triggers the NotFoundException:
@ApiMethod(name="getUser")
public User getUser() throws NotFoundException {
String filterKey = "googleId";
String filterVal = "jochen.bauer@gmail.com";
User user = OfyService.ofy().load().type(User.class).filter(filterKey, filterVal).first().now();
if (user == null) {
throw new NotFoundException("User Record does not exist");
}
return user;
}
Here is the User class:
@Entity
public class User {
@Id
Long id;
private HealthVault healthVault;
private String googleId;
public User(String googleId){
this.googleId = googleId;
this.healthVault = new HealthVault();
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public HealthVault getHealthVault() {
return healthVault;
}
public void setHealthVault(HealthVault healthVault) {
this.healthVault = healthVault;
}
public String getGoogleId() {
return googleId;
}
public void setGoogleId(String googleId) {
this.googleId = googleId;
}
}
I think it fails because of a transaction. You need to make a transactionless call like:
User user = OfyService.ofy().transactionless().load().type(User.class).filter(filterKey, filterVal).first().now();
More info about transactions on App Engine:
https://cloud.google.com/appengine/docs/java/datastore/transactions
https://github.com/objectify/objectify/wiki/Transactions
EDIT
Your entity needs the @Index annotation. It adds the field to the Datastore index, and only indexed properties can be searched on; the filter() method relies on that.
@Id
Long id;
@Index
private HealthVault healthVault;
@Index
private String googleId;
P.S. After you have updated your entity, delete the object with googleId jochen.bauer@gmail.com and write it to the database again; then Objectify will find it.
First, add @Index to the fields in your model. I don't see a field that would hold an email value like your filterVal in your model. Even so, to get the entity based on your filterVal, assuming googleId is the field of your entity:
User user = OfyService.ofy().load().type(User.class).filter("googleId", filterVal).first().now();
And if your filterKey is actually the id of your entity:
User user = OfyService.ofy().load().key(Key.create(User.class, filterKey)).now();
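A related hedged note: when loading by key, Objectify's LoadResult also offers safe(), which throws com.googlecode.objectify.NotFoundException instead of returning null, so the manual null check can be dropped if that exception suits your API (the id value here is illustrative):

// Throws com.googlecode.objectify.NotFoundException if no entity exists for the key.
User user = OfyService.ofy().load()
        .key(Key.create(User.class, 42L))
        .safe();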