How to print a Function<A,B>? - java

I would like to create a central logger class that logs messages like:
Couldn't find a record of type MyClass when searching for field
MyField
Currently, I have this piece of code:
public static <T, A, B> String logNotFound(final Class<T> type, String field) {
    return String.format(DATA_NOT_FOUND, type.getSimpleName(), field);
}
And I call it like:
Optional<Person> person = findPersonByLastName("Smith");
if (person.isEmpty()) logNotFound(Person.class, "lastName");
However, I don't quite like passing a string for the field name. I would like to call the log method as
logNotFound(Person.class, Person::getLastName)
passing a Function<A,B> as the parameter. I expect a message like
Couldn't find a record of type Person when searching for field Person::getLastName
Is there a way to do this?

This seems to be an XY problem: what you actually need is a convenient way to log errors, but you don't know how to refer to class fields in code. There are two options (a sketch of the first follows below):
Lombok's @FieldNameConstants - there are some issues
implement an annotation processor which will generate a metamodel for your classes (or, if you are on Hibernate, you may use the existing one: https://vladmihalcea.com/jpa-criteria-metamodel/)
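For illustration, a minimal sketch of the @FieldNameConstants option, applied to the Person class from the question; note that the exact name of the generated constants type depends on the Lombok version and configuration:

import lombok.experimental.FieldNameConstants;

// With @FieldNameConstants, Lombok generates an inner Fields type holding
// String constants named after the fields (names may vary by Lombok version).
@FieldNameConstants
public class Person {
    private String lastName;
}

// The logger call then needs no hand-written string; renaming the field
// updates the constant on the next compile:
// logNotFound(Person.class, Person.Fields.lastName);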

It is possible: use the safety-mirror library (or analyze its source code and implement the idea in your own way):
assertEquals("isEmpty", Fun.getName(String::isEmpty));
You can read more details here - Printing debug info on errors with java 8 lambda expressions. A sketch of the underlying trick follows below.
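The general trick behind libraries like this (a rough sketch of the technique, not the library's actual code) is to make the functional interface Serializable and read the implementing method's name from the SerializedLambda the compiler generates for method references; the names NamedGetter and logNotFound below are made up to match the question:

import java.io.Serializable;
import java.lang.invoke.SerializedLambda;
import java.lang.reflect.Method;
import java.util.function.Function;

public class MethodRefNames {

    // A Function that is also Serializable, so the compiler emits a
    // writeReplace() method returning a SerializedLambda for method references.
    public interface NamedGetter<A, B> extends Function<A, B>, Serializable {}

    // Extracts the name of the method behind a method reference, e.g. "getLastName".
    public static String getName(NamedGetter<?, ?> getter) {
        try {
            Method writeReplace = getter.getClass().getDeclaredMethod("writeReplace");
            writeReplace.setAccessible(true);
            SerializedLambda lambda = (SerializedLambda) writeReplace.invoke(getter);
            return lambda.getImplMethodName();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Not a serializable method reference", e);
        }
    }

    // Hypothetical logger built on top of it:
    // logNotFound(Person.class, Person::getLastName) -> "... searching for field getLastName"
    public static <T, R> String logNotFound(Class<T> type, NamedGetter<T, R> field) {
        return String.format("Couldn't find a record of type %s when searching for field %s",
                type.getSimpleName(), getName(field));
    }
}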

Using Lombok's SuperBuilder with Hibernate Validator (jakarta.validation.x) annotation on a container type leads to "Type mismatch"

I have a class ActivitiesModel which uses Lombok's @SuperBuilder.
import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.NotNull;
// other imports and statements omitted for brevity.
@Data
@SuperBuilder
@NoArgsConstructor
public class ActivitiesModel {
    public static final String ACTIVITIES_NOT_NULL_MESSAGE = "Activities cannot be null";
    public static final String ACTIVITY_NOT_BLANK_MESSAGE = "Activity cannot be blank";

    @NotNull(message = ACTIVITIES_NOT_NULL_MESSAGE)
    private List<@NotBlank(message = ACTIVITY_NOT_BLANK_MESSAGE) String> activities;
}
I am using this builder to create an object of ActivitiesModel, and then validating it using Hibernate's Validator interface:
// Somewhere else in the application.
// Create an object using the builder method.
ActivitiesModel activitiesModel = ActivitiesModel.builder()
        .activities(List.of("hello", "world")) // <----- Point A
        .build();
// Validate the object using Hibernate's validator.
validator.validate(activitiesModel);
However, running this code gives me the following error:
java.lang.Error:
Unresolved compilation problem:
Type mismatch: cannot convert from List<String> to List<E>
The stack trace seems to be pointing at Point A.
I have tried the following approaches:
Replacing @SuperBuilder with @Builder and @AllArgsConstructor.
Replacing the message attribute with a string literal instead of a static final variable, i.e.:
private List<@NotBlank(message = "Activity cannot be blank") String> activities;
The first approach seems to fix the error; however, it's not something I can use, as I need to extend the builder functionality to a subclass of ActivitiesModel. The issue is also present in another abstract class, so the super builder functionality for parent classes is definitely required.
The second approach also resolves the error. However, it is a bit problematic because I then need to repeat the same message string in the validation test for this model class, which is something I would like to avoid as it duplicates the string.
Another thing to note is that this error only seems to occur in the presence of an annotation on the generic type parameter of the container (@NotBlank in this case). It is not influenced by any annotations present directly on the field itself (@NotNull in this case).
So, all in all, these are the questions that I would like to get some answers to:
Somehow, Lombok is able to figure out the types in the case of a string literal but not in the case of a static final String. Why is that?
Am I going about this totally wrong? The problem occurs because I'm trying to store the message string in a variable and re-use that variable in two places: the annotation's message attribute and the validation test for the model class. Should I not be checking for the presence of the message in my validation tests, but be checking for something else instead?
For anyone who comes across this later on, the research for this issue has led me to believe that comparing message strings in tests is not the way to write validation test cases. Another downside to that approach is that you might have different validation messages for different locales. In that case, the message string itself might be a template, e.g. my.message.key, with its values in a ResourceBundle provided to Hibernate, i.e. files such as ValidationMessages.properties and ValidationMessages_de.properties.
In such a scenario you could compare the message for one locale in your validation test case, but a better approach is to check the annotation and the field for which the validation has failed. We can get both of these pieces of information via the ConstraintViolation and subsequently the ConstraintDescriptor types provided by the Bean Validation API. This way we avoid checking the message itself and rely on the actual validation annotation that has failed (see the sketch below).
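A minimal sketch of such a test, assuming JUnit 5 and the ActivitiesModel from the question; the assertions target the violated annotation type and the property path rather than the message text:

import jakarta.validation.ConstraintViolation;
import jakarta.validation.Validation;
import jakarta.validation.Validator;
import jakarta.validation.constraints.NotBlank;
import java.util.List;
import java.util.Set;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

class ActivitiesModelValidationTest {

    private final Validator validator =
            Validation.buildDefaultValidatorFactory().getValidator();

    @Test
    void blankActivityIsRejected() {
        ActivitiesModel model = ActivitiesModel.builder()
                .activities(List.of(""))   // a blank element should violate @NotBlank
                .build();

        Set<ConstraintViolation<ActivitiesModel>> violations = validator.validate(model);

        // Assert on the failed annotation and the property path, not on the message.
        ConstraintViolation<ActivitiesModel> violation = violations.iterator().next();
        assertEquals(NotBlank.class,
                violation.getConstraintDescriptor().getAnnotation().annotationType());
        assertTrue(violation.getPropertyPath().toString().startsWith("activities"));
    }
}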
As for the solution to this question, it seems it was a build cache issue. Cleaning Maven's build cache results in this code working perfectly fine, but VSCode still seems to have an issue. For now, I will choose to ignore that.

querydsl-jpa dynamic query

Let's say I have a generated Entity like this:
public class QCandidate extends EntityPathBase<Candidate> {
    public final com.avisto.candisearch.persistance.model.enums.QAvailabilityEnum availability;
    public final DatePath<java.util.Date> birthDate = createDate("birthDate", java.util.Date.class);
    public final NumberPath<Long> cvId = createNumber("cvId", Long.class);
    public final StringPath city = createString("city");
}
My input values are the field names ("availability", "birthDate", "cvId"...) and a string value that I should use to perform a 'like' against all the fields.
I want to build a query, starting from the field names, that:
casts Dates and Numbers to strings and lowercases them
if the field is an EntityPathBase (like availability), extracts the id and then again casts it to a lowercased string
Something like:
lower(cast(C.cvId as varchar(255))) like 'value'
for each field.
I can do this using the querydsl-sql module, but I want to achieve it using only the JPA module.
I'm not interested in the mechanism of creating the full 'where' clause (I know I have to use BooleanBuilder, or at least that is what I do in the SQL version).
What I want to know is how to create the individual 'where' conditions based on the field type.
I'm trying to use a PathBuilder, but it seems that to use methods like getString or getBoolean you already have to know the type of the field you are trying to extract. In my case, since I start from just the field name, I can't use these methods, and I don't know how to identify the type of each field starting from its name, so I'm stuck.
This may be a bit ugly, but it is a workable suggestion.
Note that the number of field types PathBuilder accepts is quite limited.
You can definitely find the field class from the field name (using reflection, or by maintaining a member map updated with each field).
Then implement handling for each specific type.
This can be an ugly bunch of if..else or, for a more elegant solution, a map of type handlers [class -> handler], where each handler implements an interface method that handles one specific type.
Pseudocode:
// building the query
for each field:
    Class<?> fieldClass = findFieldClass(..., field);  // use reflection or the member map
    Handler handler = handlers.get(fieldClass);
    handler.process(...);

// type handler interface
public interface Handler {
    Predicate process(Path<?> path, String value);  // Predicate: one individual 'where' condition
}

// specific type handler implementation
public class BooleanHandler implements Handler {
    public Predicate process(Path<?> path, String value) {
        BooleanPath bPath = (BooleanPath) path;
        ...
    }
}

// initialize the handlers map (a singleton or a factory) in advance
handlers.put(BooleanPath.class, new BooleanHandler());
...
Note that this is a generic solution for when you have many classes. If you have only one specific class, you can just create a permanent map of fieldName -> Handler and avoid looking up the field class.
Again, this is by no means a pretty solution, but it should work. A concrete sketch of one handler follows below.
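As an illustration only (not part of the original answer), a rough sketch of what one concrete handler for the Handler interface above might look like, assuming QueryDSL's current package layout and its stringValue()/lower()/like() chain to approximate the lower(cast(...)) like pattern from the question; NumberHandler, fieldClassOf and pathOf are made-up names:

import com.querydsl.core.types.Path;
import com.querydsl.core.types.Predicate;
import com.querydsl.core.types.dsl.NumberPath;

// Hypothetical handler for numeric fields: stringValue() is rendered by
// QueryDSL as a cast to a string type, so the generated condition is roughly
// lower(cast(cvId as varchar)) like '%value%'.
public class NumberHandler implements Handler {
    @Override
    public Predicate process(Path<?> path, String value) {
        NumberPath<?> numberPath = (NumberPath<?>) path;
        return numberPath.stringValue().lower().like("%" + value.toLowerCase() + "%");
    }
}

// Combining the per-field conditions with the BooleanBuilder mentioned in the question:
// BooleanBuilder where = new BooleanBuilder();
// for (String field : fields) {
//     where.or(handlers.get(fieldClassOf(field)).process(pathOf(field), value));
// }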

Java annotations: pass value of annotation attribute to another annotation

I have an interface Resource and several classes implementing it, for example Audio, Video... Further, I have created a custom annotation MyAnnotation with a Class-typed parameter:
@MyAnnotation(type = Audio.class)
class Audio {
...
}
@MyAnnotation(type = Video.class)
class Video {
...
}
In some other place in the code I have to use the interface Resource as a return type:
public class Operations<T extends Resource> {
....
@OtherAnnotation(type = Audio.class (if audio), type = Video.class (if video))
T getResource();
....
}
The question is how to appropriately parameterize the annotation @OtherAnnotation depending on what kind of Resource type will be returned?
What you are asking for is dynamic values for annotation attributes.
However, annotations are set at compile time, which is why their values can only be compile-time constants. You can only read them at runtime (a sketch of that follows below).
There was a similar question in which someone tried to generate the annotation value; its answer explains in a bit more detail why there is no way to dynamically generate a value used in an annotation. In that question there was an attempt to use a final class variable generated with a static method.
There are annotation processors which offer a bit more flexibility by handling placeholders. However, I don't think this fits your case, as you want the dynamic values at runtime.
This answer refers to Spring's use of its expression language for the @Value annotation, in which the placeholder (@Value("#{systemProperties.dbName}")) gets resolved with data from one of the defined property sources (for example in Spring Boot).
In any case, you will have to rethink your architecture a bit.
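To make the "read them at runtime" point concrete, a small sketch, assuming MyAnnotation is retained at runtime; AnnotationReader is a made-up name:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// The annotation from the question, assumed to be retained at runtime.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface MyAnnotation {
    Class<?> type();
}

class AnnotationReader {
    // Reads the statically declared attribute back at runtime via reflection.
    static Class<?> resourceTypeOf(Class<?> annotatedClass) {
        MyAnnotation annotation = annotatedClass.getAnnotation(MyAnnotation.class);
        return annotation != null ? annotation.type() : null;
    }
}

// Usage: AnnotationReader.resourceTypeOf(Audio.class) returns Audio.class,
// but the value itself was fixed at compile time and cannot vary per call.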

Java : method that takes in argument any attribute of any class

I need to create a method that takes as an argument any attribute of any class. But I don't want it to be of type String, to avoid refactoring problems when renaming an attribute and so that I get the errors in Eclipse's Markers tab, not while running my application.
Having a class Person:
public class Person {
    private String name;
    // other attributes...
    // getters and setters...
}
Now the needed method:
void getAnAttributeOfAClass(<which_type_or_class_here?> attr_as_arg) {
    // Now I need to get the name of the attribute, which would be of class String...
}
Is there a function or a method by which we can specify an attribute?
For example:
Person.class.name
Would it be of class Property?
EDIT
More exactly (@Smallhacker's answer helped me), I need to verify at compile time whether the argument really is an attribute of the specified class.
Person.class.name // no compile time error
Person.class.nameXXX // compile time error
The closest things to what you want are the Reflection API's Field and the JavaBeans Introspector API's PropertyDescriptor.
But usually things like that are not needed in Java projects, because there are libraries which handle these concerns.
You could pass a Class object along with a String name, then let your method use Introspector internally to read that property; a sketch follows below.
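A minimal sketch of that Introspector approach (PropertyReader and read are made-up names, and Person is assumed to follow the usual getter convention):

import java.beans.BeanInfo;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

class PropertyReader {
    // Looks up a bean property by name and reads its value from the given instance.
    static Object read(Object bean, String propertyName) throws Exception {
        BeanInfo info = Introspector.getBeanInfo(bean.getClass());
        for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
            if (pd.getName().equals(propertyName)) {
                return pd.getReadMethod().invoke(bean);
            }
        }
        throw new IllegalArgumentException("No property named " + propertyName);
    }
}

// Usage: read(somePerson, "name") calls Person#getName() via its PropertyDescriptor.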
Not sure I understand you well, but there is a class java.lang.reflect.Field that has a method getName() which would give you the name of the field.
In your example, to get the field, you would do: Person.class.getDeclaredField("name").
EDIT: to get the value of a field in an object, you would do: field.get(obj);
OK, let's say you have the following variables:
Person person = ...; // initialized with some Person
Field nameField = Person.class.getDeclaredField("name");
Now to get the name of person, you would do:
String personName = (String) nameField.get(person);
Actually, this would throw an exception because name is a private field. You can, however, bypass the protection by doing:
nameField.setAccessible(true);
A consolidated version of these fragments follows below.
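Putting those fragments together into one runnable sketch (assumes Person has a setName setter, as implied by the question):

import java.lang.reflect.Field;

class FieldAccessDemo {
    public static void main(String[] args) throws Exception {
        Person person = new Person();
        person.setName("Alice");                         // assumes a setter exists

        Field nameField = Person.class.getDeclaredField("name");
        nameField.setAccessible(true);                   // bypass private access

        System.out.println(nameField.getName());            // prints "name"
        System.out.println((String) nameField.get(person)); // prints "Alice"
    }
}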
Unfortunately, Java lacks the ability to reference member variables in a way that can be analyzed at compile time.
There may be some kind of library that simplifies this somewhat, but it wouldn't provide a full solution due to limitations in the language itself.
Maybe Java generics can help you with this.
You can do something like:
class YourClass<E> {
    void getAnAttributeOfAClass(E attr_as_arg) {
        // some code
    }
}
YourClass<Person> someVariable = new YourClass<>();
someVariable.getAnAttributeOfAClass(someObject); // this will not compile if someObject is not an instance of Person
But I still don't know what you want to do exactly inside the method.

Pass custom value to Reducer

I want/need to pass the rowkey along to the Reducer, as the rowkey is calculated in advance and the information is not available anymore at that stage (the Reducer executes a Put).
First I tried to just use inner classes, e.g.
public class MRMine {
    private byte[] rowkey;

    public void start(Configuration c, Date d) {
        // calc rowkey based on date
        TableMapReduceUtil.initTableMapperJob(...);
        TableMapReduceUtil.initTableReducerJob(...);
    }

    public class MyMapper extends TableMapper<Text, IntWritable> {...}
    public class MyReducer extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {...}
}
and both MyMapper and MyReducer have the default constructor defined. But this approach leads to the following exception(s):
java.lang.RuntimeException: java.lang.NoSuchMethodException: com.mycompany.MRMine$MyMapper.<init>()
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
Caused by: java.lang.NoSuchMethodException: com.company.MRMine$MyMapper.<init>()
at java.lang.Class.getConstructor0(Class.java:2730)
at java.lang.Class.getDeclaredConstructor(Class.java:2004)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:109)
I got rid of the exception by declaring the inner classes static (Runtimeexception: java.lang.NoSuchMethodException: tfidf$Reduce.<init>()), but then I'd have to make the rowkey static as well, and I'm running multiple jobs in parallel.
I found https://stackoverflow.com/a/6739905/1338732, where the configure method of the Reducer is overridden, but it doesn't seem to be available anymore. Anyhow, I wouldn't be able to pass along a value.
I was thinking of (mis)using the Configuration by just adding a new key-value pair; would this work, and is it the correct approach?
Is there a way to pass along any custom value to the reducer?
The versions I'm using are: HBase 0.94.6.1, Hadoop 1.0.4.
Your problem statement is a little unclear; however, I think something like this is what you are looking for.
The way I currently pass information to the reducer is to pass it in the Configuration.
In the job setup, do the following:
conf.set("someName", "someValue");
This creates an entry in the configuration with name someName and value someValue. It can later be retrieved in the Mapper/Reducer by doing the following:
Configuration conf = context.getConfiguration();
String someVariable = conf.get("someName");
This code will set the value of someVariable to "someValue", allowing the information to be passed to the reducer.
To pass multiple values, use setStrings(). I haven't tested this function yet, but according to the documentation it should work with one of the following two options (the documentation is a little unclear, so try both and use whichever works):
conf.setStrings("someName", "value1,value2,value3");
conf.setStrings("someName", "value1", "value2", "value3");
Retrieve them using (note that getStrings() returns a String array):
Configuration conf = context.getConfiguration();
String[] someVariables = conf.getStrings("someName");
Hope this helps. A combined sketch for the rowkey case from the question follows below.
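A rough sketch combining the two fixes (static nested classes plus passing the value through the Configuration) for the rowkey case; the configuration key my.job.rowkey and the example rowkey are made up, and the details of the Put are left out:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

public class MRMine {

    public void start(Configuration conf /*, Date d */) {
        byte[] rowkey = Bytes.toBytes("20130101");        // calc rowkey based on the date
        conf.set("my.job.rowkey", Bytes.toString(rowkey)); // hand the value to the tasks
        // TableMapReduceUtil.initTableMapperJob(...);
        // TableMapReduceUtil.initTableReducerJob(...);
    }

    // static, so Hadoop can instantiate it via its no-arg constructor
    public static class MyReducer extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {

        private byte[] rowkey;

        @Override
        protected void setup(Context context) {
            rowkey = Bytes.toBytes(context.getConfiguration().get("my.job.rowkey"));
        }

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            Put put = new Put(rowkey);
            // put.add(...) family/qualifier/value based on the aggregated counts
            context.write(new ImmutableBytesWritable(rowkey), put);
        }
    }
}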
The goal is a little unclear, but I have found that for many types of jobs involving HBase, you do not need a reducer to put data into HBase. The mapper reads a row, modifies it in some way, then writes it back.
Obviously there are jobs for which that is inappropriate (any type of aggregation for example), but the reduce stage can really slow down a job.
