graphql-spqr cannot query parent class fields - java

I tried to implement entity classes using polymorphism.
This is my BaseEntity
@Getter
@Setter
@Accessors(chain = true)
@MappedSuperclass
@NoArgsConstructor
@AllArgsConstructor
@EntityListeners(AuditingEntityListener.class)
public class BaseEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Size(max = 55, message = "name length more than 55")
    private String name;

    @Size(max = 255, message = "remark length more than 255")
    private String remark;
}
And my entity
@Data
@NoArgsConstructor
@Table(name = "sys_user")
@Entity(name = "sys_user")
@Accessors(chain = true)
@ToString(callSuper = true)
@EqualsAndHashCode(callSuper = true)
public class SysUser extends BaseEntity implements Serializable {
    @NonNull
    private String username;

    @NonNull
    private String password;
}
In my controller
@Controller
@GraphQLApi
@RequiredArgsConstructor
public class SysUserController implements BaseController {
    private final SysUserRepository sysUserRepository;

    @GraphQLQuery
    public List<SysUser> sysUsers() {
        return sysUserRepository.findAll();
    }
}
My GraphQL Config
@Configuration
@RequiredArgsConstructor
public class GraphQLConfig {
    private final @NotNull List<BaseController> controllerLists;

    @Bean
    public GraphQLSchema graphqlSchema() {
        GraphQLSchemaGenerator generator = new GraphQLSchemaGenerator();
        generator.withOperationsFromSingletons(controllerLists.toArray());
        return generator.generate();
    }
}
Now, when I query:
{
  sysUsers {
    username
  }
}
the result is correct:
{
  "data": {
    "sysUsers": [
      {
        "username": "Hello"
      }
    ]
  }
}
But when I try to get a field from the parent class:
{
  sysUsers {
    name
  }
}
I get an error:
{
  "errors": [
    {
      "message": "Validation error of type FieldUndefined: Field 'name' in type 'SysUser' is undefined @ 'sysUsers/name'",
      "locations": [
        {
          "line": 3,
          "column": 5
        }
      ]
    }
  ]
}
I use io.leangen.graphql:graphql-spqr-spring-boot-starter:0.0.4
How can I resolve this?
Thanks!

Inherited fields will only be exposed if they're within the configured packages. This way, you don't accidentally expose framework fields, JDK fields (like hashCode), etc. If no base packages are configured, SPQR will stay within the package of the directly exposed class.
To configure the base packages, add something like:
graphql.spqr.base-packages=your.root.package,your.other.root.package
to your application.properties file.
Note: These rules will get relaxed in the next release of SPQR, so that all non-JDK fields are exposed by default, as the current behavior seems to confuse too many people.
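If you build the schema yourself, as in the config class shown in the question, the base packages can presumably also be set programmatically on the generator. This is a sketch, assuming the `withBasePackages` method available in recent graphql-spqr versions (verify it exists in the SPQR version your starter bundles):

```java
// Sketch: setting base packages directly on the schema generator,
// so inherited fields from BaseEntity are picked up.
GraphQLSchemaGenerator generator = new GraphQLSchemaGenerator()
        .withBasePackages("your.root.package", "your.other.root.package")
        .withOperationsFromSingletons(controllerLists.toArray());
GraphQLSchema schema = generator.generate();
```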

I'd recommend adding auto-generation of classes based on the types defined in your GraphQL schema.
It will give you more clarity on what is exposed to the user and help avoid such errors in the future.
Here are the plugins:
Gradle plugin: graphql-java-codegen-gradle-plugin
Maven plugin: graphql-java-codegen-maven-plugin

Related

How can I make javax validation both of sub class and super class?

I'm using the Spring Framework,
and I ran into an inheritance problem while writing controller logic.
First of all,
this is my Controller code snippet
@PostMapping("/pay/detail")
public ResponseEntity<PayDetail.Response> getPayDetail(
        @Valid PayDetail.Request payDetail
) {
    ... some code
}
and PayDetail class looks like this
public class PayDetail {
    @Getter
    @Setter
    @NoArgsConstructor
    @AllArgsConstructor
    public static class Request extends CommReqForm {
        @NotNull(message = "Not null please")
        private String work_type;
    }
}
and CommReqForm
@Data
@AllArgsConstructor
@NoArgsConstructor
public class CommReqForm {
    @NotEmpty(message = "servicecode not empty")
    private String servicecode;

    @NotEmpty(message = "reqtype not empty")
    private String reqtype;
}
I want to validate both PayDetail.Request and CommReqForm, but only PayDetail.Request is being validated.
How can I solve this problem?
@Valid does not seem to validate the super class. I want both the sub class and the super class to be validated.

Post request transforms int field to string automatically

I'm building a dummy hospital app. The problem I'm having is that I'm trying to verify that when a Patient is created, the fields passed are of the correct type, but whenever I POST an int into a String field, it doesn't fail and just converts the int to a String. The field I'm trying to make fail is "surname", which, by the definition of the Patient class, is a String.
If I do this (I pass a number to the "surname" field):
{
  "name": "John",
  "surname": 43,
  "sickness": "headache"
}
It just converts 43 into a String by the time it reaches the controller method.
Here we have the Patient object:
@Data
@Entity
@NoArgsConstructor
@AllArgsConstructor
public class Patient implements Serializable {
    private static final long serialVersionUID = 4518011202924886996L;

    @Id
    // TODO: possibly change the GenerationType later
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Column(name = "patient_id")
    private Long id;

    @Column(name = "patient_name")
    @JsonProperty(required = true)
    private String name;

    @Column(name = "patient_surname")
    @JsonProperty(required = true)
    private String surname;

    @Column(name = "patient_sickness")
    @JsonProperty(required = true)
    private String sickness;
}
And this is the controller class:
@Controller
@Path("/patient")
@Produces(MediaType.APPLICATION_JSON + ";charset=utf-8")
public class PatientController {
    @POST
    @Path("")
    public ResponseEntity<Object> postPatient(final Patient patient) {
        ResponseEntity<Object> createdPatient = patientBusiness.createPatient(patient);
        return new ResponseEntity<Patient>(createdPatient.getBody(), createdPatient.getStatusCode());
    }
}
EDIT 1:
Following the clues and narrowing the focus, I tried modifying the ObjectMapper, but my configuration isn't being applied; I'm still getting the behavior described above.
This is the config class:
@Configuration
public class JacksonConfig {
    @Bean
    @Primary
    public ObjectMapper getModifiedObjectMapper() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(MapperFeature.ALLOW_COERCION_OF_SCALARS, false);
        mapper.coercionConfigFor(LogicalType.Integer).setCoercion(CoercionInputShape.String, CoercionAction.Fail);
        return mapper;
    }
}
Even added the property to the application.yml, but still nothing:
spring:
  jackson:
    mapper:
      allow-coercion-of-scalars: false
Any help is appreciated. Thanks!
In the end I followed this post to write a deserializer and a module, so that the behavior applies across the whole application and not just the one field I don't want converted:
Disable conversion of scalars to strings when deserializing with Jackson
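As a hedged sketch of that approach (the class name here is illustrative, not taken from the linked post): a custom deserializer that rejects any non-string JSON token instead of coercing it, registered globally through a SimpleModule:

```java
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
import com.fasterxml.jackson.databind.module.SimpleModule;

import java.io.IOException;

// Rejects JSON values that are not real strings instead of coercing them.
public class StrictStringDeserializer extends StdDeserializer<String> {
    public StrictStringDeserializer() {
        super(String.class);
    }

    @Override
    public String deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        if (p.hasToken(JsonToken.VALUE_STRING)) {
            return p.getText();
        }
        // Reports a type mismatch, e.g. for "surname": 43
        return (String) ctxt.handleUnexpectedToken(String.class, p);
    }
}
```

Registered once, it applies to every String field: `mapper.registerModule(new SimpleModule().addDeserializer(String.class, new StrictStringDeserializer()));`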

Getting DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted for multilevel/complex object

I have a complex object Test in the entity class Item.
@AllArgsConstructor
@Getter
public enum TestStatus {
    TO_RUN("To Run"),
    RUNNING("Running"),
    PASSED("Passed"),
    FAILED("Failed");

    public static TestStatus fromValue(String value) {
        //...implementation
    }

    private final String value;
}
@Data
@ToString
@Accessors(chain = true)
@DynamoDBFlattened(attributes = {
    @DynamoDBAttribute(attributeName = "test.task.id", mappedBy = "id"),
    @DynamoDBAttribute(attributeName = "test.task.status", mappedBy = "status")
})
public class TestTask {
    private String id;

    @DynamoDBTypeConvertedEnum
    private TestStatus status;
}
@Data
@ToString
@Accessors(chain = true)
@DynamoDBFlattened(attributes = {
    @DynamoDBAttribute(attributeName = "test.suite.name", mappedBy = "name"),
    @DynamoDBAttribute(attributeName = "test.suite.version", mappedBy = "version")
})
public class TestSuite {
    private String name;
    private String version;
}
@Data
@ToString
@Accessors(chain = true)
public class Test {
    private TestSuite suite;
    private TestTask task;
}
@Data
@ToString
@Accessors(chain = true)
@DynamoDBTable(tableName = "com.example.item")
public class Item {
    private String name;
    private Test test; // This is a complex object with the structure given above.
}
On the call to dynamoDBMapper.save(item); I get an exception.
@Repository
@RequiredArgsConstructor
public class DynamoDBItemRepository implements ItemRepository {
    //...
    @Override
    public Item save(Item item) {
        dynamoDBMapper.save(item); // Getting DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted
        return item;
    }
    //...
}
I am getting the exception
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: not supported; requires #DynamoDBTyped or #DynamoDBTypeConverted
at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$Rules$NotSupported.set(StandardModelFactories.java:664) ~[aws-java-sdk-dynamodb-1.11.578.jar:?]
What am I missing? Please help!
There are two problems in the code.
I tried to reproduce the error and found the first problem: no hash key is specified,
so I used Item.name as the hash key in order to go further with the test.
The second problem matched your description:
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: not supported; requires #DynamoDBTyped or #DynamoDBTypeConverted
It turns out you missed the @DynamoDBDocument annotation, which should be added to the class Test since it is a nested type:
...
@DynamoDBDocument
...
public class Test {
See the documentation here.
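Putting both fixes together, the relevant classes would look roughly like this (a sketch; the placement of @DynamoDBHashKey on Item.name reflects the test setup described above, not the asker's original schema):

```java
@Data
@DynamoDBTable(tableName = "com.example.item")
public class Item {
    @DynamoDBHashKey // fix 1: the table needs a declared hash key
    private String name;
    private Test test;
}

@Data
@DynamoDBDocument // fix 2: nested types must be marked as documents
public class Test {
    private TestSuite suite;
    private TestTask task;
}
```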
I suggest migrating to the AWS SDK for Java 2.x, where you can use complex objects: doc

Javax validation on nested objects - not working

In my Spring Boot project I have two DTO's which I'm trying to validate, LocationDto and BuildingDto. The LocationDto has a nested object of type BuildingDto.
These are my DTO's:
LocationDto
public class LocationDto {
    @NotNull(groups = { Existing.class })
    @Null(groups = { New.class })
    @Getter
    @Setter
    private Integer id;

    @NotNull(groups = { New.class, Existing.class })
    @Getter
    @Setter
    private String name;

    @NotNull(groups = { New.class, Existing.class, LocationGroup.class })
    @Getter
    @Setter
    private BuildingDto building;

    @NotNull(groups = { Existing.class })
    @Getter
    @Setter
    private Integer lockVersion;
}
BuildingDto
public class BuildingDto {
    @NotNull(groups = { Existing.class, LocationGroup.class })
    @Null(groups = { New.class })
    @Getter
    @Setter
    private Integer id;

    @NotNull(groups = { New.class, Existing.class })
    @Getter
    @Setter
    private String name;

    @NotNull(groups = { Existing.class })
    @Getter
    @Setter
    private List<LocationDto> locations;

    @NotNull(groups = { Existing.class })
    @Getter
    @Setter
    private Integer lockVersion;
}
Currently, I can validate in my LocationDto that the properties name and building are not null, but I can't validate the presence of the property id which is inside building.
If I use the #Valid annotation on the building property, it would validate all of its fields, but for this case I only want to validate its id.
How could that be done using javax validation?
This is my controller:
@PostMapping
public LocationDto createLocation(@Validated({ New.class, LocationGroup.class }) @RequestBody LocationDto location) {
    // save entity here...
}
This is a correct request body: (should not throw validation errors)
{
  "name": "Room 44",
  "building": {
    "id": 1
  }
}
This is an incorrect request body: (must throw validation errors because the building id is missing)
{
  "name": "Room 44",
  "building": { }
}
Just try adding @Valid to the collection; it works as described in the Hibernate reference:
@Getter
@Setter
@Valid
@NotNull(groups = { Existing.class })
private List<LocationDto> locations;
The @Valid annotation must be added to cascade validation into class attributes.
LocationDto.class
public class LocationDto {
    @Valid
    private BuildingDto building;
    .........
}
Use @ConvertGroup from Bean Validation 1.1 (JSR-349).
Introduce a new validation group, say Pk.class, and add it to the groups in BuildingDto:
public class BuildingDto {
    @NotNull(groups = { Pk.class, Existing.class, LocationGroup.class })
    // Other constraints
    private Integer id;
    //
}
And then cascade in LocationDto as follows:
@Valid
@ConvertGroup.List({
    @ConvertGroup(from = New.class, to = Pk.class),
    @ConvertGroup(from = LocationGroup.class, to = Pk.class)
})
// Other constraints
private BuildingDto building;
Further Reading:
5.5. Group conversion from Hibernate Validator reference.

Java to JSON serialization with Jackson PTH and Spring Data MongoDB DBRef generates extra target property

When serializing from Java to JSON, Jackson generates an extra target property for referenced entities when using the Spring Data MongoDB #DBRef annotation with lazy loading and Jackson’s polymorphic type handling. Why does this occur, and is it possible to omit the extra target property?
Code Example
@Document(collection = "cdBox")
public class CDBox {
    @Id
    public String id;

    @DBRef(lazy = true)
    public List<Product> products;
}

@Document(collection = "album")
public class Album extends Product {
    @DBRef(lazy = true)
    public List<Song> songs;
}

@Document(collection = "single")
public class Single extends Product {
    @DBRef(lazy = true)
    public List<Song> songs;
}

@Document(collection = "song")
public class Song {
    @Id
    public String id;
    public String title;
}

@JsonTypeInfo(use = JsonTypeInfo.Id.NAME,
        property = "productType",
        include = JsonTypeInfo.As.EXTERNAL_PROPERTY)
@JsonSubTypes(value = {
    @JsonSubTypes.Type(value = Single.class),
    @JsonSubTypes.Type(value = Album.class)
})
public abstract class Product {
    @Id
    public String id;
}
Generated JSON
{
  "id": "someId1",
  "products": [
    {
      "id": "someId2",
      "songs": [
        {
          "id": "someId3",
          "title": "Some title",
          "target": {
            "id": "someId3",
            "title": "Some title"
          }
        }
      ]
    }
  ]
}
The target field is added by Spring Data because it is a lazy collection; it's analogous to the dataHandler etc. that Hibernate adds for JPA.
Option 1:
To ignore it, just add @JsonIgnoreProperties(value = { "target" }) at the class level:
@Document(collection = "song")
@JsonIgnoreProperties(value = { "target" })
public class Song {
    ...
}
Option 2:
Make the collection not lazy.
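For Option 2, that means dropping lazy loading on the reference (lazy defaults to false), e.g.:

```java
@DBRef // lazy defaults to false, so real Song documents are loaded eagerly
public List<Song> songs;
```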
