I have an app with several @MappedSuperclass entities. From one of them I need to write a CSV with columns in a very particular order established by the client.
Calling Entity.class.getDeclaredFields() used to be enough to retrieve and write the columns in the right order before we had superclasses, but now, even with a custom solution that iterates through the superclasses' fields, the order is incorrect. So I resorted to a DTO entity that returns the right order when calling getDeclaredFields().
The problems come when I try to retrieve the values present in the related entities. We used to do something like:
Object value = getLineFromField(field, line);
where the getLineFromField() method looks like this:
private Object getLineFromField(Field field, Entity line) {
    Object value = null;
    try {
        value = field.get(line);
    } catch (Exception e) {
        LOG.info("There is no value. Adding a WhiteSpace to the Column Value");
    }
    return value;
}
The problem appears in field.get(line): this java.lang.reflect.Field method always returns a null value.
Any experience out there doing a similar mapping?
Just trying to avoid writing a super-ugly 100-liner switch case in the codebase...
EDIT to add the internal accessor involved in the failure, from the reflection machinery: UnsafeObjectFieldAccessorImpl
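A guess at the cause, since the post does not show how the Field objects are obtained: field.get(obj) only works when the Field belongs to the object's own class hierarchy, so a Field taken from the DTO class cannot read values from the real entity. A minimal sketch (method and variable names are mine, not from the original code) that keeps the DTO only for ordering and resolves each field by name on the actual entity, walking up through the @MappedSuperclass parents:
private Object getValueByName(String fieldName, Object line) {
    // Walk the real entity's hierarchy, including @MappedSuperclass parents
    Class<?> clazz = line.getClass();
    while (clazz != null) {
        try {
            java.lang.reflect.Field field = clazz.getDeclaredField(fieldName);
            field.setAccessible(true); // superclass fields are usually private
            return field.get(line);
        } catch (NoSuchFieldException e) {
            clazz = clazz.getSuperclass(); // not declared here, keep climbing
        } catch (IllegalAccessException e) {
            return null; // same fallback behavior as getLineFromField
        }
    }
    return null;
}
The DTO's getDeclaredFields() still drives the column order; only the value lookup changes.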
I have a problem with one piece of functionality in my Spring app. I have 2 tables in the same database, and both contain the same type of data (id, title, description and date). I can get the data from one table but don't know how to insert it into the 2nd table.
In my @Service layer I can get the data from table A, but I don't know how to convert it into an object of the other class (both classes contain the same data).
Injected JpaRepositories
private TasksRepository theTasksRepository;
private TasksRepositoryArchive theTasksRepositoryArchive;
And here's the code to get the object from table A (TasksRepository is a JpaRepository):
public Tasks findById(int theId) {
    // Check if a value is present
    Optional<Tasks> result = theTasksRepository.findById(theId);
    Tasks theTask = null;
    if (result.isPresent()) {
        // value is present
        theTask = result.get();
    } else {
        // value is absent
        throw new RuntimeException("Task with given ID couldn't be found " + theId);
    }
    return theTask;
}
1) Define 2 entities, one for each table. To copy data, create an instance of the 2nd type, copy the properties, and save. There are many ways to copy properties: you can call each getter and setter manually, or use a library like Dozer or MapStruct. Don't forget to set the ID to null (see the sketch after this list).
2) If you want to keep an archive of changes, use a library that helps implement that. For instance, consider Hibernate Envers.
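A minimal sketch of option 1, assuming a TasksArchive entity mapped to the second table with the same properties as Tasks, and a nullable (wrapper) ID type; it uses Spring's BeanUtils, but Dozer or MapStruct would do the same job:
import org.springframework.beans.BeanUtils;

public void archiveTask(int taskId) {
    Tasks source = findById(taskId);        // the lookup shown above
    TasksArchive copy = new TasksArchive(); // hypothetical entity for table B
    BeanUtils.copyProperties(source, copy); // shallow copy of matching properties
    copy.setId(null);                       // assumes a nullable Integer ID, so the DB generates a new one
    theTasksRepositoryArchive.save(copy);
}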
I have a REST API which accepts query parameters. The query parameters are valid if and only if exactly one query parameter is passed at a time and it is among the list of valid query parameters.
Currently my logic for this is:
I collect the query params in a map and then check its size. If the size is greater than 1, the function throws an error. Otherwise, I iterate through the map, and if a parameter other than the valid ones is found, the function throws an error.
For example:
if (queryParam.size() > 1) {
    throw new FailureResponse();
}
queryParam.forEach((key, value) -> {
    if (!key.equalsIgnoreCase("p1") && !key.equalsIgnoreCase("p2")) {
        throw new FailureResponse();
    }
});
But I think this way I am violating the SOLID open/closed principle, which says a class should be open for extension but closed for modification.
I also thought of keeping the acceptable params in a file and reading them from it, but that would add to the API's response time since it involves reading a file.
Is there a way to keep and read the valid query params that does not violate the design principles?
You could maintain an enum of valid params and extend it as and when applicable, like:
public enum QueryParams {
    PARAM_1("param1"),
    PARAM_2("param2"); // note the semicolon after the last constant

    private final String paramValue;

    QueryParams(String paramValue) {
        this.paramValue = paramValue;
    }

    public String getParamValue() {
        return this.paramValue;
    }
}
and then you could iterate over the set of values of this enum to filter out invalid values
List<String> validParams = Arrays.stream(QueryParams.values())
        .map(QueryParams::getParamValue)
        .collect(Collectors.toList());
queryParams.removeAll(validParams); // assumes queryParams holds the parameter names
if (!queryParams.isEmpty()) {
    throw new FailureResponse();
}
This lets you keep the API class unchanged: whenever a new parameter is added, you just extend the enum, and everything else follows automatically, since it all depends on the values in the enum.
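Building on that idea (a refinement of mine, not part of the answer above): precompute the lookup set once inside the enum and expose a single check, which also preserves the case-insensitive matching from the question:
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;

public enum QueryParams {
    PARAM_1("p1"),
    PARAM_2("p2");

    private final String paramValue;

    QueryParams(String paramValue) {
        this.paramValue = paramValue;
    }

    public String getParamValue() {
        return paramValue;
    }

    // Enum constants are initialized before static fields, so values() is safe here
    private static final Set<String> VALID =
            Arrays.stream(values())
                  .map(q -> q.getParamValue().toLowerCase())
                  .collect(Collectors.toSet());

    public static boolean isValid(String paramName) {
        return VALID.contains(paramName.toLowerCase());
    }
}
The controller check then collapses to queryParam.keySet().stream().allMatch(QueryParams::isValid), and adding a parameter only ever touches the enum.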
This is my sample mapping in Hibernate:
class ApplnDoc {
    AdmAppln admAppln;
    // getters and setters
}

class AdmAppln {
    Set<Student> student;
    // getters and setters
}

class Student {
    int id;
    String registerNo;
    AdmAppln admAppln;
    // getters and setters
}
In the ApplnDoc table we store the images of all candidates. AdmAppln stores admission details, and Student stores student details. Even though AdmAppln has a Set of Student, only one Student record will be present for a particular AdmAppln id (under one AdmAppln there is only one Student).
Now I want to write a few fields from these tables into an Excel file, whose records must be sorted by registerNo (if it is present), otherwise by the Student's id. We are using the XSSFWorkbook class from the org.apache.poi.xssf.usermodel package for operations on the Excel sheet. Here I found a way to sort the Excel sheet itself, but I instead implemented the sorting in code using the Comparable interface.
This is what I did in the ApplnDoc class:
public int compareTo(ApplnDoc otherData) {
    if (new ArrayList<Student>(this.admAppln.getStudents()).get(0).getRegisterNo() != null
            && !new ArrayList<Student>(this.admAppln.getStudents()).get(0).getRegisterNo().isEmpty()
            && new ArrayList<Student>(otherData.admAppln.getStudents()).get(0).getRegisterNo() != null
            && !new ArrayList<Student>(otherData.admAppln.getStudents()).get(0).getRegisterNo().isEmpty()) {
        return new ArrayList<Student>(this.admAppln.getStudents()).get(0).getRegisterNo()
                .compareTo(new ArrayList<Student>(otherData.admAppln.getStudents()).get(0).getRegisterNo());
    } else {
        return new ArrayList<Student>(this.admAppln.getStudents()).get(0).getId()
                - new ArrayList<Student>(otherData.admAppln.getStudents()).get(0).getId();
    }
}
Since there is no get() method in the Set interface, the only way to get a Student's registerNo from AdmAppln was to convert the Set to a List. Then I sorted the list and iterated over it to generate the Excel file.
Is the above comparison mechanism a proper one, or is there a better way? The reason I am asking is that when the Hibernate session is closed and my compareTo accesses child-table columns, I get an invocation exception.
There are some things worth discussing here:
1-
Even though AdmAppln has a Set of Student, only one Student record will
be present for a particular AdmAppln id
Why?
Is this something you have no control over, or is there a particular reason to keep a Set where it is not needed? (Also, I'm assuming a @OneToMany instead of a @OneToOne mapping, as sketched below.)
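For reference, the mapping I am assuming (not shown in the original post) looks roughly like this:
import java.util.HashSet;
import java.util.Set;
import javax.persistence.*;

@Entity
class AdmAppln {
    @Id @GeneratedValue
    private int id;

    @OneToMany(mappedBy = "admAppln") // collections are lazy by default
    private Set<Student> student = new HashSet<>();
}

@Entity
class Student {
    @Id @GeneratedValue
    private int id;

    private String registerNo;

    @ManyToOne
    private AdmAppln admAppln;
}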
2-
This leads to the child objects being lazily fetched (N.B. this is an assumption, since you didn't post the relevant mapping code or how you fetch the entity from the DB).
This means you have to either switch to eager fetching in the entity (not recommended) or request it when fetching the entities, as in the sketch below.
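For example, a fetch join at query time (Hibernate 5 style; entity and field names taken from the mapping above) keeps the default lazy mapping but loads the students for this use case, so compareTo can run after the session is closed:
List<ApplnDoc> docs = session.createQuery(
        "select distinct d from ApplnDoc d " +
        "join fetch d.admAppln a " +
        "join fetch a.student s",
        ApplnDoc.class)
    .getResultList();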
3-
Also, please refactor that compareTo to use local variables:
public int compareTo(ApplnDoc otherData) {
    Student thisStudent = new ArrayList<>(this.admAppln.getStudents()).get(0);
    Student otherStudent = new ArrayList<>(otherData.admAppln.getStudents()).get(0);
    if (thisStudent.getRegisterNo() != null && !thisStudent.getRegisterNo().isEmpty()
            && otherStudent.getRegisterNo() != null && !otherStudent.getRegisterNo().isEmpty()) {
        return thisStudent.getRegisterNo().compareTo(otherStudent.getRegisterNo());
    } else {
        return thisStudent.getId() - otherStudent.getId();
    }
}
While there is nothing wrong with that comparison mechanism as such (except the NullPointerException you will get if the Set of students is empty), you should let the database do the ordering when querying.
If you still want to compare that way, you just have to make sure everything you need is fetched before the session is closed.
You need to load the entire object tree before closing the session, or you will get an exception. By the way, you can always sort the records in the query itself.
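For instance, extending the fetch-join query above with an order by clause (recent Hibernate versions accept nulls last in HQL; if yours does not, a coalesce on registerNo works too):
List<ApplnDoc> docs = session.createQuery(
        "select distinct d from ApplnDoc d " +
        "join fetch d.admAppln a join fetch a.student s " +
        "order by s.registerNo nulls last, s.id",
        ApplnDoc.class)
    .getResultList();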
I want to know the best way to check a variable's type at runtime.
public Iterator<?> read(String entityName, String propertyName, Object propertyValue) {
    String query = "select * from " + entityName + " where " + propertyName + "=";
    try {
        int value = Integer.parseInt((String) propertyValue);
        query = query + value;
    } catch (NumberFormatException e) {
        // failed
    }
    try {
        String value = (String) propertyValue;
        query = query + "'" + value + "'";
    } catch (ClassCastException e) {
        // failed
    }
    try {
        float value = Float.parseFloat((String) propertyValue);
        query = query + value;
    } catch (NumberFormatException e) {
        // failed
    }
    // Creating JDBC connection and execute query
    Iterator<Element> result = queryConn.execute();
    return result;
}
I need to check whether the variable type is int, float or String at runtime. Is there a better way to do this?
Or do I need to write a separate method for each variable type?
Try this code:
if(floatVariable instanceof Float){}
if(intVariable instanceof Integer){}
if(stringVariable instanceof String){}
There are many ways to handle this scenario.
Use function overloading for different data types
Use instanceof operator to determine data type
Try to cast the property value to a numeric data type; if the cast succeeds, omit the single quotes, otherwise apply them
Since you are getting an Object as input, you can always check it using the instanceof keyword. And instead of primitives, try using the wrapper classes (e.g. Integer.class). One more thing: you should always use a PreparedStatement; your code is prone to SQL injection. For example:
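A minimal sketch of the PreparedStatement point (connection handling elided; the method shape is mine): the driver maps the bound value's type itself, so no int/float/String detection is needed, and injection through propertyValue becomes impossible. Note that entityName and propertyName still have to be validated against a whitelist, because identifiers cannot be bound as parameters.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public ResultSet read(Connection connection, String entityName,
                      String propertyName, Object propertyValue) throws SQLException {
    String sql = "select * from " + entityName + " where " + propertyName + " = ?";
    PreparedStatement ps = connection.prepareStatement(sql);
    ps.setObject(1, propertyValue); // the driver maps Integer/Float/String itself
    return ps.executeQuery();       // caller must close the statement and result set
}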
Is there any other best way to do this?
I would recommend that you name the columns you want to select in your actual query. If you take this approach, you can parse each column as the appropriate type without worrying about type casting issues. If, for example, the first column selected were an integer type, then you would just call Integer.parseInt() without worrying about having the wrong type.
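To illustrate (column names here are hypothetical): with named columns each position has a known type, so every value is read with the matching typed getter and no runtime type checks are needed.
try (ResultSet rs = stmt.executeQuery("select id, name, price from item")) {
    while (rs.next()) {
        int id = rs.getInt("id");           // integer column
        String name = rs.getString("name"); // string column
        float price = rs.getFloat("price"); // float column
    }
}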
And here is an argument for why using SELECT * is an anti-pattern:
If you use SELECT * as your query, then we don't even know how many columns are being returned. To even take a guess at that, we would have to analyze how many columns your code seems to expect. But, what would happen if someone were to change the schema, thereby possibly changing the order in which the RDBMS returns columns? Then your entire application logic might have to change.
I am using common FieldSetMapper logic found through searches and in examples on Stack Overflow, and I have run into a situation which surprised me. Either it is a feature or a bug, but I thought I would present it here for review to see how others handle it.
Using Spring Batch, I have a pipe-delimited file which has string and number values that may be optional depending on position. For example:
string|string|number|number|string
string||number||string
In your field set mapper class which implements FieldSetMapper, you usually do some mapping such as:
newThingy.setString1(fieldSet.readString("string1"));
newThingy.setString2(fieldSet.readString("string2"));
newThingy.setValue1(fieldSet.readInt("value1"));
newThingy.setValue2(fieldSet.readInt("value2"));
newThingy.setString3(fieldSet.readString("string3"));
During testing the code for line 1 above worked fine.
For line 2, with the blank values for string2 and value2, a Java exception was thrown for the number but not for the string:
Caused by: java.lang.NumberFormatException: Unparseable number:
at org.springframework.batch.item.file.transform.DefaultFieldSet.parseNumber(DefaultFieldSet.java:754)
at org.springframework.batch.item.file.transform.DefaultFieldSet.readInt(DefaultFieldSet.java:323)
at org.springframework.batch.item.file.transform.DefaultFieldSet.readInt(DefaultFieldSet.java:335)
at com.healthcloud.batch.mapper.MemberFieldSetMapper.mapFieldSet(MemberFieldSetMapper.java:31)
at com.healthcloud.batch.mapper.MemberFieldSetMapper.mapFieldSet(MemberFieldSetMapper.java:1)
I did some research in the DefaultFieldSet.java class provided by Spring Batch, which implements the FieldSet interface, to try to understand what is going on.
What I found is that the readAndTrim function called by readString returns null if the value read is blank:
protected String readAndTrim(int index) {
    String value = tokens[index];
    if (value != null) {
        return value.trim();
    }
    else {
        return null;
    }
}
... but when using readInt (and maybe others), we get an exception instead:
private Number parseNumber(String candidate) {
    try {
        return numberFormat.parse(candidate);
    }
    catch (ParseException e) {
        throw new NumberFormatException("Unparseable number: " + candidate);
    }
}
I do see that some of the methods let you supply a default value, but null is obviously not allowed. What I would expect is consistent behavior across all methods in FieldSet implementations, which would let me match the file to my database as the data is read. Blank values in delimited and fixed-length files are fairly common.
If number-based values cannot be handled properly, I will probably have to read everything as a String and then manually handle the conversion to the database types, which obviously defeats the purpose of using Spring Batch.
Am I missing something that I should handle better? I can add more code if needed; I just felt this is commonly used and wanted to keep it short. Will edit as needed.
Edit: adding info on the unit tests I found for the Spring Batch class.
The comments in the test case state that a default should be set instead, but why? I don't want a default. My database allows a null value in the Integer column. I would have to set the default to some arbitrary number which hopefully no one EVER sends, check for it before insert, and then switch it to null on insert. I still don't like this "feature."
@Test
public void testReadBlankInt() {
    // Trying to parse a blank field as an integer, but without a default
    // value, should throw a NumberFormatException
    try {
        fieldSet.readInt(13);
        fail();
    }
    catch (NumberFormatException ex) {
        // expected
    }
    try {
        fieldSet.readInt("BlankInput");
        fail();
    }
    catch (NumberFormatException ex) {
        // expected
    }
}
Always sanity-check your input/data. I'll usually throw together a Util class with all the parse/read/verification logic I need. Bare-bones version below...
// Note: 'default' is a reserved word in Java, so the parameter is renamed.
// StringUtils.isNumeric is from Apache Commons Lang; it returns false for
// blanks and also for signed or decimal values, which then fall back to the default.
public static Integer getInteger(FieldSet fs, String key, Integer defaultValue) {
    if (StringUtils.isNumeric(fs.readString(key))) {
        return fs.readInt(key);
    } else {
        return defaultValue;
    }
}
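Used inside a FieldSetMapper it looks like this (reusing the setter names from the question; the Thingy type name is mine): a blank value1 column becomes null instead of throwing, matching a nullable Integer database column.
@Override
public Thingy mapFieldSet(FieldSet fieldSet) {
    Thingy newThingy = new Thingy();
    newThingy.setString1(fieldSet.readString("string1"));
    newThingy.setValue1(getInteger(fieldSet, "value1", null)); // null when blank
    return newThingy;
}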