How to use Spring Data with Couchbase without the _class attribute - Java

Is there a simple way to use Spring Data Couchbase with documents that do not have a _class attribute?
In Couchbase I have something like this in my sampledata bucket:
{
    "username" : "alice",
    "created" : 1473292800000,
    "data" : { "a": 1, "b" : "2" },
    "type" : "mydata"
}
Now, is there any way to define a mapping from this document structure to a Java object (note that the _class attribute is missing and cannot be added) and vice versa, so that I get all (or most) of the automagical features of Spring Data Couchbase?
Something like:
If the type field has the value "mydata", use the class MyData.java.
So when a find is performed, instead of automatically adding AND _class = "mydata" to the generated query, add AND type = "mydata".

Spring Data in general needs the _class field to know which class to instantiate when deserializing.
It's fairly easy in Spring Data Couchbase to use a different field name than _class, by overriding the typeKey() method in the AbstractCouchbaseDataConfiguration.
But by default it will still expect a fully qualified class name in that field.
Getting around that will require quite a bit more work:
You'll need to implement your own CouchbaseTypeMapper, following the model of DefaultCouchbaseTypeMapper. In the super(...) constructor, you'll need to provide an additional argument: a list of TypeInformationMapper. The default implementation doesn't explicitly provide one, so a SimpleTypeInformationMapper is used, which is the one that puts FQNs.
There's an alternative implementation that is configurable so you can alias specific classes to a shorter name via a Map: ConfigurableTypeInformationMapper...
So by putting a ConfigurableTypeInformationMapper with the alias you want for specific classes, plus a SimpleTypeInformationMapper after it in the list (for the case where you serialize a class that you didn't provide an alias for), you can achieve your goal.
The typeMapper is used within the MappingCouchbaseConverter, which you'll unfortunately also need to extend (just to instantiate your typeMapper instead of the default one).
Once you have that, again override the configuration to return an instance of your custom MappingCouchbaseConverter that uses your custom CouchbaseTypeMapper (the mappingCouchbaseConverter() method).
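For illustration, a minimal sketch of such a type mapper, assuming the spring-data-couchbase 2.x class names used later in this thread (the mapper name and the "mydata" alias are illustrative, not part of the framework):
// Sketch only: alias MyData to "mydata" and fall back to fully qualified
// class names for everything that has no explicit alias.
public class AliasingCouchbaseTypeMapper extends DefaultTypeMapper<CouchbaseDocument>
        implements CouchbaseTypeMapper {

    private final String typeKey;

    public AliasingCouchbaseTypeMapper(final String typeKey) {
        super(new DefaultCouchbaseTypeMapper.CouchbaseDocumentTypeAliasAccessor(typeKey),
                Arrays.asList(
                        // alias specific classes to short names
                        new ConfigurableTypeInformationMapper(
                                Collections.singletonMap(MyData.class, "mydata")),
                        // fallback: fully qualified class names
                        new SimpleTypeInformationMapper()));
        this.typeKey = typeKey;
    }

    @Override
    public String getTypeKey() {
        return typeKey;
    }
}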

You can achieve this e.g. by creating a custom annotation @DocumentType
#DocumentType("billing")
#Document
public class BillingRecordDocument {
String name;
// ...
}
Document will look like:
{
    "type" : "billing",
    "name" : "..."
}
Just create following classes:
Create a custom AbstractReactiveCouchbaseConfiguration or AbstractCouchbaseConfiguration (depending on which variant you use)
@Configuration
@EnableReactiveCouchbaseRepositories
public class CustomReactiveCouchbaseConfiguration extends AbstractReactiveCouchbaseConfiguration {

    // implement abstract methods
    // and configure the custom mapping converter

    @Bean(name = BeanNames.COUCHBASE_MAPPING_CONVERTER)
    public MappingCouchbaseConverter mappingCouchbaseConverter() throws Exception {
        MappingCouchbaseConverter converter =
                new CustomMappingCouchbaseConverter(couchbaseMappingContext(), typeKey());
        converter.setCustomConversions(customConversions());
        return converter;
    }

    @Override
    public String typeKey() {
        return "type"; // this will override '_class'
    }
}
Create custom MappingCouchbaseConverter
public class CustomMappingCouchbaseConverter extends MappingCouchbaseConverter {

    public CustomMappingCouchbaseConverter(final MappingContext<? extends CouchbasePersistentEntity<?>,
            CouchbasePersistentProperty> mappingContext, final String typeKey) {
        super(mappingContext, typeKey);
        this.typeMapper = new TypeBasedCouchbaseTypeMapper(typeKey);
    }
}
and a custom annotation @DocumentType
@Persistent
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE})
public @interface DocumentType {
    String value();
}
Then create a TypeAwareTypeInformationMapper which just checks whether an entity is annotated with @DocumentType. If so, it uses the value from that annotation; if not, it falls back to the default (the fully qualified class name).
public class TypeAwareTypeInformationMapper extends SimpleTypeInformationMapper {

    @Override
    public Alias createAliasFor(TypeInformation<?> type) {
        DocumentType[] documentType = type.getType().getAnnotationsByType(DocumentType.class);
        if (documentType.length == 1) {
            return Alias.of(documentType[0].value());
        }
        return super.createAliasFor(type);
    }
}
Then register it as follows:
public class TypeBasedCouchbaseTypeMapper extends DefaultTypeMapper<CouchbaseDocument> implements CouchbaseTypeMapper {

    private final String typeKey;

    public TypeBasedCouchbaseTypeMapper(final String typeKey) {
        super(new DefaultCouchbaseTypeMapper.CouchbaseDocumentTypeAliasAccessor(typeKey),
                Collections.singletonList(new TypeAwareTypeInformationMapper()));
        this.typeKey = typeKey;
    }

    @Override
    public String getTypeKey() {
        return typeKey;
    }
}

In your Couchbase configuration class you just need to have:
@Override
public String typeKey() {
    return "type";
}
Unfortunately, for query derivation (N1QL) the _class/type value still uses the class name. I tried Spring Data Couchbase 2.2.6 and this is a minus point here.
@Simon, are you aware of any change that would add support for a custom _class/type value in the next release(s)?

@SimonBasle
Inside the class N1qlUtils and the method createWhereFilterForEntity we have access to the CouchbaseConverter. On the line:
String typeValue = entityInformation.getJavaType().getName();
Why not use the typeMapper from the converter to get the name of the entity when we want to avoid using the class name? Otherwise you have to annotate each method in your repository as follows:
#Query("#{#n1ql.selectEntity} WHERE `type`='airport' AND airportname = $1")
List<Airport> findAirportByAirportname(String airportName);
If createWhereFilterForEntity used the CouchbaseConverter we could avoid annotating with @Query.

Access annotation attributes from a custom OVal annotation

Is it possible, when using a custom OVal annotation and a custom check class, to access the annotation and retrieve the annotation attributes that were used?
Reference for OVal: https://sebthom.github.io/oval/USERGUIDE.html#custom-constraint-annotations
Minimal example
Let's assume we have a class Foo.
It has two annotated fields.
Each time, the annotation has a different myValue – a and b.
class Foo {

    @CustomAnnotation(myValue = "a")
    public String first;

    @CustomAnnotation(myValue = "b")
    public String second;
}
This is the annotation.
It declares that the check should be performed using MyCheck.class and sets a default value for myValue.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.PARAMETER, ElementType.METHOD})
@Constraint(checkWith = MyCheck.class)
public @interface CustomAnnotation {
    String myValue() default "";
}
Now we want to use OVal to validate these fields.
Most importantly, we want to extract the value a or b from the annotation's myValue and use it inside our validation logic.
public class MyCheck extends AbstractAnnotationCheck<CustomAnnotation> {

    @Override
    public boolean isSatisfied(Object validatedObject, Object valueToValidate, OValContext context,
                               Validator validator) throws OValException {
        // how to get the value of `myValue`, which is `a` or `b` or the default empty string?
    }
}
What I have tried and failed with:
validatedObject is the Foo instance. You can easily get its fields and annotations. However, there is no way to tell which of the two annotations applies to the value being validated.
valueToValidate is in this case the String value – whatever first or second holds.
context is not useful; you can only get the compile-time type from it, which is String.
validator – not useful?
After some digging in the superclass I found that you can override the method configure.
This method receives as its only parameter the annotation currently being checked on the field, so you can then read myValue.
public class MyCheck extends AbstractAnnotationCheck<CustomAnnotation> {

    private String myValue;

    @Override
    public void configure(CustomAnnotation customAnnotation) {
        super.configure(customAnnotation);
        this.myValue = customAnnotation.myValue();
    }

    @Override
    public boolean isSatisfied(Object validatedObject, Object valueToValidate, OValContext context,
                               Validator validator) throws OValException {
        if (myValue.equals("a")) {
            // validation logic for fields annotated with myValue = "a"
        } else if (myValue.equals("b")) {
            // validation logic for fields annotated with myValue = "b"
        } else {
            // default validation logic
        }
        return true; // placeholder result
    }
}
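For completeness, a minimal usage sketch (the Validator API is standard OVal; the field values are purely illustrative):
// OVal configures one MyCheck instance per annotated field, so configure(...)
// sees that field's own myValue before isSatisfied(...) runs.
Foo foo = new Foo();
foo.first = "some value";
foo.second = "another value";

Validator validator = new Validator();
List<ConstraintViolation> violations = validator.validate(foo);
violations.forEach(v -> System.out.println(v.getMessage()));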

How to set tableName dynamically using environment variable in spring boot?

I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the dev env), "test_users" (for the test env), etc. (This is how our company uses the same DynamoDB account for different environments.)
So I would like to change the tableName of the model class using an environment variable passed through the "AWS ECS task definition" environment parameters.
For Example.
My Model Class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables. But I'm not sure how to use variables outside the class. Is there any way that I can use the Docker env variable to select my table prefix?
My intent is to use it like this:
I know this is not possible like this. But is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before writing the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class package name as the "basePackage" path. (Please check the image.)
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed on to this method, but it is null. So I'm checking all possible ways to pass the value here.
Check the image for the error:
I haven't changed anything in my user model class, as the beans should replace the name of the DynamoDBTable when they are executed. But the table name overriding is not happening; data is pulled from the table name given at the model class level only.
What am I missing here?
The table names can be altered via an altered DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add the bean as shown below. Here the prefix can be the environment name in your case.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For more details check out the complete details here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
I was able to achieve table names prefixed with the active profile name.
First I added a TableNameResolver class as below,
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {

    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below,
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
I added the variable envProfile, which holds the active profile property value accessed from the application.properties file.
#Value("${spring.profiles.active}")
private String envProfile;
We had the same issue with regard to the need to change table names at runtime. We are using spring-data-dynamodb 5.0.2 and the following configuration seems to provide the solution that we need.
First I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX which is wired in via SpEL.
#Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I setup a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected it into the getDynamoDBMapperConfig() method via the DynamoDBMapperConfig builder. But this will also do the job.
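A minimal sketch of that alternative wiring, assuming the same builder API as above (the bean method names are illustrative):
// Sketch only: expose the standard type converter factory as its own bean and
// inject it together with the TableNameOverride when building the mapper config.
@Bean
public DynamoDBTypeConverterFactory dynamoDBTypeConverterFactory() {
    return DynamoDBTypeConverterFactory.standard();
}

@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride,
                                                    DynamoDBTypeConverterFactory typeConverterFactory) {
    return new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverride)
            .withTypeConverterFactory(typeConverterFactory)
            .build();
}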
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
#Profile(value= {"dev","default"})
#Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}
#Profile(value= {"prod"})
#Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {

    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}

Dynamically injecting generic objects with guice

My current situation:
I want to inject the following class into my application:
public interface IConfigAccessor<T extends IConfig> {
...
}
ConfigAccessors are proxy objects, created dynamically at runtime. The creation of these objects works as follows:
public class ConfigFactory implements IConfigFactory {

    private final IConfigUpdater updater;

    @Inject
    public ConfigFactory(IConfigUpdater updater) {
        this.updater = updater;
    }

    @Override
    public <T extends IConfig> IConfigAccessor<T> register(final String configKey, final Class<T> configClass) {
        ConfigCache<T> configCache = new ConfigCache<>(new SomeOtherThings(), configKey, configClass);
        updater.register(configCache);
        return new ConfigAccessor<>(configCache, configKey, configClass);
    }
}
As you can see, to create these objects I need to inject the ConfigUpdater and other dependencies. This means that Guice needs to be fully configured already.
To get the instance out of Guice, I use the following code:
IConfigFactory configClient = injector.getInstance(IConfigFactory.class);
IConfigAccessor<ConcreteConfig> accessor = configClient.register("key", ConcreteConfig.class);
How I want to inject them via Guice:
Currently, I can get the required objects, but I have to manually pass them around in my application.
Instead, what I want to have is the following:
public class SomeClass {

    @Inject
    public SomeClass(@Config(configKey = "key") IConfigAccessor<ConcreteConfig> accessor) {
        // hurray!
    }
}
What's the correct approach/technology to get this working?
After a lot of research, I'm feeling a bit lost on how to approach this topic. There are a lot of different things Guice offers, including simple Providers, custom Listeners which scan classes and identify custom annotations, FactoryModuleBuilders and more.
My problem is quite specific, and I'm not sure which of these things to use and how to get it working. I'm not even sure if this is even possible with Guice?
Edit: What I have so far
I have the following annotation which I want to use on constructor parameters:
@Target({ ElementType.FIELD, ElementType.PARAMETER })
@Retention(RetentionPolicy.RUNTIME)
public @interface InjectConfig {
    String configKey();
}
Inside the module, I can bind a provider to IConfigAccessor (with the above annotation) as such:
bind(IConfigAccessor.class).annotatedWith(InjectConfig.class)
.toProvider(new ConfigProvider<>());
However, there are two problems with this:
The provider cannot provide an IConfigAccessor. To create such an instance, the provider would need an IConfigUpdater, but since I use 'new' for the provider, I can't inject it.
Inside the provider, there is no way to find out the configKey used in the annotation.
Second approach:
Let's assume that I already know all configurations and configKeys I want to inject during startup. In this case, I could loop over all possible configKeys and have the following binding:
String configKey = "some key";
final Class<? extends IConfig> configClass =...;
bind(IConfigAccessor.class).annotatedWith(Names.named(configKey))
.toProvider(new ConfigProvider<>(configKey, configClass));
However, problem (1) still remains: the provider cannot get an IConfigUpdater instance.
The main problem here is that you cannot use the value of the annotation in the injection. There is another question which covers this part:
Guice inject based on annotation value
Instead of binding a provider instance, you should bind the provider class, and get the class by injecting a TypeLiteral.
That way, your config factory can look like that:
public class ConfigFactory<T extends IConfig> implements IConfigFactory {

    @Inject private IConfigUpdater updater;
    @Inject private TypeLiteral<T> type;

    @Override
    public IConfigAccessor<T> register(final String configKey) {
        Class<T> configClass = (Class<T>) type.getRawType();
        ConfigCache<T> configCache = new ConfigCache<>(new SomeOtherThings(), configKey, configClass);
        updater.register(configCache);
        return new ConfigAccessor<>(configCache, configKey, configClass);
    }
}
And then SomeClass:
public class SomeClass {

    @Inject
    public SomeClass(ConfigFactory<ConcreteConfig> factory) {
        IConfigAccessor<ConcreteConfig> accessor = factory.register("key");
    }
}
Since SomeClass needs to know "key" anyway, this is not too much a change information-wise. The downside is that the SomeClass API now gets a factory instead of the concrete config.
[EDIT]
And here is someone who actually did inject annotated values using custom injection.

@ConditionalOnProperty for lists or arrays?

I'm using Spring Boot 1.4.3 auto-configuration, where I create beans automatically based on properties the user specifies. The user can specify an array of services, where name and version are required fields:
service[0].name=myServiceA
service[0].version=1.0
service[1].name=myServiceB
service[1].version=1.2
...
If the user forgets to specify a required field on even just one service, I want to back off and not create any beans. Can I accomplish this with @ConditionalOnProperty? I want something like:
@Configuration
@ConditionalOnProperty({"service[i].name", "service[i].version"})
class AutoConfigureServices {
    ....
}
This is the custom Condition I created. It needs some polishing to be more generic (i.e. not hardcoding strings), but it worked great for me.
To use it, I annotated my Configuration class with @Conditional(RequiredRepeatablePropertiesCondition.class).
public class RequiredRepeatablePropertiesCondition extends SpringBootCondition {

    private static final Logger LOGGER = LoggerFactory.getLogger(RequiredRepeatablePropertiesCondition.class.getName());

    public static final String[] REQUIRED_KEYS = {
        "my.services[i].version",
        "my.services[i].name"
    };

    @Override
    public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata metadata) {
        List<String> missingProperties = new ArrayList<>();
        RelaxedPropertyResolver resolver = new RelaxedPropertyResolver(context.getEnvironment());
        Map<String, Object> services = resolver.getSubProperties("my.services");
        if (services.size() == 0) {
            missingProperties.addAll(Arrays.asList(REQUIRED_KEYS));
            return getConditionOutcome(missingProperties);
        }

        // gather indexes to check: [0], [1], [3], etc.
        Pattern p = Pattern.compile("\\[(\\d+)\\]");
        Set<String> uniqueIndexes = new HashSet<String>();
        for (String key : services.keySet()) {
            Matcher m = p.matcher(key);
            if (m.find()) {
                uniqueIndexes.add(m.group(1));
            }
        }

        // loop over each index and check the required props
        uniqueIndexes.forEach(index -> {
            for (String genericKey : REQUIRED_KEYS) {
                String multiServiceKey = genericKey.replace("[i]", "[" + index + "]");
                if (!resolver.containsProperty(multiServiceKey)) {
                    missingProperties.add(multiServiceKey);
                }
            }
        });

        return getConditionOutcome(missingProperties);
    }

    private ConditionOutcome getConditionOutcome(List<String> missingProperties) {
        if (missingProperties.isEmpty()) {
            return ConditionOutcome.match(ConditionMessage.forCondition(RequiredRepeatablePropertiesCondition.class.getCanonicalName())
                    .found("property", "properties")
                    .items(Arrays.asList(REQUIRED_KEYS)));
        }
        return ConditionOutcome.noMatch(
                ConditionMessage.forCondition(RequiredRepeatablePropertiesCondition.class.getCanonicalName())
                        .didNotFind("property", "properties")
                        .items(missingProperties)
        );
    }
}
Old question, but I hope my answer will help for Spring Boot 2.x:
Thanks to @Brian, I checked the migration guide, where I was inspired by the example code. This code works for me:
final List<String> services = Binder.get(context.getEnvironment()).bind("my.services", List.class).orElse(null);
I did try to get a List of POJOs (as AutoConfigureService), but my class differs from AutoConfigureServices. For that purpose, I used:
final Services services = Binder.get(context.getEnvironment()).bind("my.services", Services.class).orElse(null);
Well, keep playing :-D
Here's my take on this issue, using custom conditions in Spring auto-configuration. It is somewhat similar to what @Strumbels proposed, but more reusable.
@Conditional annotations are executed very early during application startup. Property sources are already loaded, but ConfigurationProperties beans are not yet created. However, we can work around that issue by binding the properties to a Java POJO ourselves.
First I introduce a functional interface which will let us define any custom logic for checking whether the properties are in fact present or not. In your case this method will take care of checking whether the property list is empty/null and whether all items within it are valid.
public interface OptionalProperties {
    boolean isPresent();
}
Now let's create an annotation which will be meta-annotated with Spring's @Conditional and allow us to define custom parameters. prefix represents the property namespace and targetClass represents the configuration properties model class to which the properties should be mapped.
@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Conditional(OnConfigurationPropertiesCondition.class)
public @interface ConditionalOnConfigurationProperties {

    String prefix();

    Class<? extends OptionalProperties> targetClass();
}
And now the main part. The custom condition implementation.
public class OnConfigurationPropertiesCondition extends SpringBootCondition {

    @Override
    public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata metadata) {
        MergedAnnotation<ConditionalOnConfigurationProperties> mergedAnnotation =
                metadata.getAnnotations().get(ConditionalOnConfigurationProperties.class);
        String prefix = mergedAnnotation.getString("prefix");
        Class<?> targetClass = mergedAnnotation.getClass("targetClass");

        // type precondition
        if (!OptionalProperties.class.isAssignableFrom(targetClass)) {
            return ConditionOutcome.noMatch("Target type does not implement the OptionalProperties interface.");
        }

        // the crux of this solution: binding properties to a Java POJO
        Object bean = Binder.get(context.getEnvironment()).bind(prefix, targetClass).orElse(null);

        // if the properties are not present at all, return no match
        if (bean == null) {
            return ConditionOutcome.noMatch("Binding properties to target type resulted in null value.");
        }

        OptionalProperties props = (OptionalProperties) bean;
        // execute the method from the OptionalProperties interface to check whether
        // the condition should match; it can include any custom logic
        // using the property values in a type-safe manner
        if (props.isPresent()) {
            return ConditionOutcome.match();
        } else {
            return ConditionOutcome.noMatch("Properties are not present.");
        }
    }
}
Now you should create your own configuration properties class implementing the OptionalProperties interface.
#ConfigurationProperties("your.property.prefix")
#ConstructorBinding
public class YourConfigurationProperties implements OptionalProperties {
// Service is your POJO representing the name and version subproperties
private final List<Service> services;
#Override
public boolean isPresent() {
return services != null && services.stream().all(Service::isValid);
}
}
And then in the Spring @Configuration class:
@Configuration
@ConditionalOnConfigurationProperties(prefix = "your.property.prefix", targetClass = YourConfigurationProperties.class)
class AutoConfigureServices {
    ....
}
There are two downsides to this solution:
The property prefix must be specified in two locations: on the @ConfigurationProperties annotation and on the @ConditionalOnConfigurationProperties annotation. This can partially be alleviated by defining a public static final String PREFIX = "namespace" in your configuration properties POJO (see the sketch after this list).
The property binding process is executed separately for each use of our custom conditional annotation, and then once again to create the configuration properties bean itself. It happens only during app startup, so it shouldn't be an issue, but it is still an inefficiency.
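A minimal sketch of that mitigation, reusing the names above (the constant value is illustrative):
// Both annotations reference the same compile-time constant,
// so the prefix only has to be maintained in one place.
@ConfigurationProperties(YourConfigurationProperties.PREFIX)
@ConstructorBinding
public class YourConfigurationProperties implements OptionalProperties {
    public static final String PREFIX = "your.property.prefix";
    // ...
}

@Configuration
@ConditionalOnConfigurationProperties(prefix = YourConfigurationProperties.PREFIX,
        targetClass = YourConfigurationProperties.class)
class AutoConfigureServices {
    // ...
}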
You can leverage the org.springframework.boot.autoconfigure.condition.OnPropertyListCondition class. For example, given you want to check for the service property having at least one value:
class MyListCondition extends OnPropertyListCondition {
    MyListCondition() {
        super("service", () -> ConditionMessage.forCondition("service"));
    }
}

@Configuration
@Conditional(MyListCondition.class)
class AutoConfigureServices {
}
See the org.springframework.boot.autoconfigure.webservices.OnWsdlLocationsCondition used on org.springframework.boot.autoconfigure.webservices.WebServicesAutoConfiguration#wsdlDefinitionBeanFactoryPostProcessor for an example within Spring itself.

Jackson JSON library: how to instantiate a class with abstract fields that can't access its concrete representation?

This is the same question as:
Jackson JSON library: how to instantiate a class that contains abstract fields
Nevertheless, its solution is not possible, since my abstract class is in another project than the concrete one.
Is there a way, then?
EDIT
My architecture is as follows:
public class UserDTO {
    ...
    private LanguageDTO lang;
}
I send that user object:
restTemplate.postForObject(this.getHttpCore().trim() + "admin/user/save/1/" + idUser, userEntity, UserDTO.class);
Then I am supposed to receive it in the function:
@RequestMapping(value = "/save/{admin}/{idUser}", method = RequestMethod.POST)
public String saveUserById(@RequestBody final UserEntity user, @PathVariable Integer idUser, @PathVariable boolean admin)
with UserEntity defined as:
public class UserEntity extends AbstractUserEntity {
    ...
}

public abstract class AbstractUserEntity {
    ...
    private AbstractLanguageEntity lang;
}
I would like to know how I can specify that lang should be instantiated as a LanguageEntity, given that the abstract classes are in another project.
This could work, assuming you can configure how the object gets serialized. See the example here. Look under "1.1. Global default typing" to set the defaults to include extra information in your JSON string, basically the concrete Java type that must be used when deserializing.
Since it seems you need to do this for your Spring servlet, you would have to pass a Spring message converter as mentioned here.
Then inside your custom ObjectMapper, you can do the necessary configuration:
public class JSONMapper extends ObjectMapper {
    public JSONMapper() {
        this.enableDefaultTyping();
    }
}
You could probably also make it work with mix-ins, which allow you to add annotations to classes already defined. You can see an example here. This will also need to be configured inside the ObjectMapper.
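A minimal sketch of the mix-in idea, assuming the entities shown above (the mix-in class name is illustrative):
// Sketch only: a mix-in telling Jackson which concrete class to use for the
// abstract lang field, registered on the custom ObjectMapper.
public abstract class AbstractUserEntityMixin {
    @JsonDeserialize(as = LanguageEntity.class)
    private AbstractLanguageEntity lang;
}

ObjectMapper mapper = new JSONMapper();
mapper.addMixIn(AbstractUserEntity.class, AbstractUserEntityMixin.class);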
If you need the same functionality on your client side (REST template), you can pass the object mapper as shown here.
The easiest way to solve that issue is to add getters and setters in UserEntity, but specifying a concrete class:
public LanguageEntity getLang() {
    return (LanguageEntity) lang;
}

public void setLang(LanguageEntity language) {
    this.lang = language;
}
If all that you want to achieve is to note that LanguageEntity is the implementation of AbstractLanguageEntity, you can register this mapping via a module:
SimpleModule myModule = new SimpleModule()
        .addAbstractTypeMapping(AbstractLanguageEntity.class, LanguageEntity.class);

ObjectMapper mapper = new ObjectMapper()
        .registerModule(myModule);
