How do I prefix class names with a custom string? - java

In order to implement some kind of namespace, I need to prefix the keys of a Redis JPA repository with a static string within a whole Spring application.
I read about the spring.cache.redis.key-prefix configuration option but it seems to be applicable to caches only.
How do I get the same behavior for JPA repositories?

In your @EnableRedisRepositories you can do:
@EnableRedisRepositories(keyspaceConfiguration = MyCustomKeyspaceConfiguration.class)
Then in the app config add a RedisMappingContext bean and the custom keyspace configuration class:
@Bean
public RedisMappingContext keyValueMappingContext() {
    return new RedisMappingContext(
            new MappingConfiguration(new IndexConfiguration(), new MyCustomKeyspaceConfiguration()));
}

public static class MyCustomKeyspaceConfiguration extends KeyspaceConfiguration {

    @Override
    protected Iterable<KeyspaceSettings> initialConfiguration() {
        List<KeyspaceSettings> settings = new ArrayList<>();
        settings.add(new KeyspaceSettings(Foo.class, "my-prefix" + Foo.class.getName()));
        return settings;
    }
}
In the case above we're saying that keys for the class Foo should live in a keyspace of "my-prefix" followed by the class name. KeyspaceConfiguration allows for the programmatic setup of keyspaces and time-to-live options for certain types.
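For example, a minimal sketch of the same entry also carrying a time-to-live (assuming the setTimeToLive setter on KeyspaceSettings):
// Sketch: give Foo entities a one-hour TTL in addition to the prefixed keyspace.
KeyspaceSettings fooSettings = new KeyspaceSettings(Foo.class, "my-prefix" + Foo.class.getName());
fooSettings.setTimeToLive(3600L); // seconds
settings.add(fooSettings);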

Related

How to retrieve custom annotation fields?

I would like to implement a custom annotation that could be applied to a class (once inside an app), to enable a feature (access to remote resources). If this annotation is placed on any config class, it will set the access for the whole app. So far it isn't that hard (see example below), but I want to include some definition fields in the @interface that will be used in the access establishing process.
As an example, Spring has something very similar: @EnableJpaRepositories. Access is enabled to the DB, with parameters in the annotation containing definitions. For example: @EnableJpaRepositories(bootstrapMode = BootstrapMode.DEFERRED)
So far, I have:
To create only the access, I'm using something like this:
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(AccessHandlerConfiguration.class)
public @interface EnableAccessHandlerAutoconfigure {
    String name() default "";
}
Using it:
@EnableAccessHandlerAutoconfigure(name = "yoni")
@Configuration
public class config {}
AccessHandlerConfiguration is a configuration class that contains beans that establish the connection.
The problem I'm having is that I don't know how to retrieve the field name's value. What should I do?
Retrieving the value may be accomplished as follows:
this.getClass().getAnnotation(EnableAccessHandlerAutoconfigure.class).name()
To expand on my comment with an actual example configuration class that uses this:
@EnableAccessHandlerAutoconfigure(name = "yoni")
@Configuration
public class SomeConfiguration {
    @Bean
    SomeBean makeSomeBean() {
        return new SomeBean(this.getClass().getAnnotation(EnableAccessHandlerAutoconfigure.class).name());
    }
}
This is how you get the value of name; what you do next is up to you.
After a long search, I found a way: Spring's ApplicationContext has a method that retrieves bean names according to their annotation, getBeanNamesForAnnotation; then you can get the annotation itself with findAnnotationOnBean, and finally simply use the field getter.
@Configuration
public class AccessHandlerConfiguration {
    private final ApplicationContext applicationContext;

    public AccessHandlerConfiguration(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
        String[] beansWithTheAnnotation = applicationContext.getBeanNamesForAnnotation(EnableRabbitAutoconfigure.class);
        for (String beanName : beansWithTheAnnotation) {
            EnableRabbitAutoconfigure annotationOnBean = applicationContext.findAnnotationOnBean(beanName, EnableRabbitAutoconfigure.class);
            System.out.println("**********" + beanName + "*********************" + annotationOnBean.name() + "*******************");
        }
    }
}
Results:
**********config*********************yoni*******************

Multi-tenancy with a separate database per customer, using Spring Data ArangoDB

So far, the only way I know to set the name of a database, to use with Spring Data ArangoDB, is by hardcoding it in a database() method while extending AbstractArangoConfiguration, like so:
@Configuration
@EnableArangoRepositories(basePackages = { "com.company.mypackage" })
public class MyConfiguration extends AbstractArangoConfiguration {

    @Override
    public ArangoDB.Builder arango() {
        return new ArangoDB.Builder();
    }

    @Override
    public String database() {
        // Name of the database to be used
        return "example-database";
    }
}
What if I'd like to implement multi-tenancy, where each tenant has data in a separate database and use e.g. a subdomain to determine which database name should be used?
Can the database used by Spring Data ArangoDB be determined at runtime, dynamically?
This question is related to the discussion here: Manage multi-tenancy ArangoDB connection - but is Spring Data ArangoDB specific.
Turns out this is delightfully simple: just change the ArangoConfiguration database() method @Override to return a Spring Expression (SpEL):
@Override
public String database() {
    return "#{tenantProvider.getDatabaseName()}";
}
which in this example references a TenantProvider @Component which can be implemented like so:
@Component
public class TenantProvider {
    private final ThreadLocal<String> databaseName;

    public TenantProvider() {
        super();
        databaseName = new ThreadLocal<>();
    }

    public String getDatabaseName() {
        return databaseName.get();
    }

    public void setDatabaseName(final String databaseName) {
        this.databaseName.set(databaseName);
    }
}
This component can then be @Autowired wherever in your code to set the database name, such as in a servlet filter, or in my case in an Apache Camel route Processor and in database service methods.
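For instance, a minimal sketch of a servlet filter (the filter class and the subdomain-to-database mapping are assumed here, not part of the original setup) that picks the database per request:
import java.io.IOException;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

@Component
public class TenantFilter extends OncePerRequestFilter {

    @Autowired
    private TenantProvider tenantProvider;

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {
        // Assumes the subdomain maps directly to the database name, e.g. "acme.example.com" -> "acme".
        String subdomain = request.getServerName().split("\\.")[0];
        tenantProvider.setDatabaseName(subdomain);
        filterChain.doFilter(request, response);
    }
}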
P.S. I became aware of this possibility by reading the ArangoTemplate code, a Spring Expression support documentation section, and one merged pull request.

How to set tableName dynamically using environment variable in spring boot?

I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the Dev env), "test_users" (for the Test env), etc. (This is how our company uses the same Dynamo account for different environments.)
So I would like to change the "tableName" of the model class using the environment variable passed through "AWS ECS task definition" environment parameters.
For Example.
My Model Class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables. But I'm not sure how to use variables outside the class. Is there any way that I can use the docker env variable to select my table prefix?
My Intent is to use like this:
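// Illustrative snippet only: roughly what I'd like to write, even though an
// annotation attribute can't resolve a Spring placeholder like this.
@DynamoDBTable(tableName = "${DOCKER_ENV}_users")
public class User {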
I know this is not possible like this. But is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before describing the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class package name as the "basePackage" path (please check the image).
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed into this method, but it is null. So I am checking all the possible ways to pass the value here.
Check the image for the error.
I haven't changed anything in my user model class, as the beans should replace the name of the DynamoDBTable when they are executed. But the table name overriding is not happening; data is pulled from the table name given at the model class level only.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add a bean as shown below; here the prefix can be the environment name.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For more details check out the complete details here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
I am able to achieve table names prefixed with the active profile name.
First I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {

    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
I added the variable envProfile, which holds the active profile value read from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
We have the same issue with regard to the need to change table names during runtime. We are using spring-data-dynamodb 5.0.2 and the following configuration seems to provide the solution that we need.
First I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired in via SpEL:
@Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method using the DynamoDBMapperConfig builder. But this will also do the job.
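A rough sketch of that alternative (the bean wiring here is assumed, not taken from the original setup) could look like:
@Bean
public DynamoDBTypeConverterFactory dynamoDBTypeConverterFactory() {
    // Expose the standard converter factory as its own bean.
    return DynamoDBTypeConverterFactory.standard();
}

@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride,
                                                    DynamoDBTypeConverterFactory typeConverterFactory) {
    // Combine the table-name override and the injected converter factory via the builder.
    return new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverride)
            .withTypeConverterFactory(typeConverterFactory)
            .build();
}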
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
@Profile(value = {"dev", "default"})
@Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}

@Profile(value = {"prod"})
@Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {

    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}

#ConditionalOnProperty for lists or arrays?

I'm using Spring Boot 1.4.3 @AutoConfiguration, where I create beans automatically based on properties the user specifies. The user can specify an array of services, where name and version are required fields:
service[0].name=myServiceA
service[0].version=1.0
service[1].name=myServiceB
service[1].version=1.2
...
If the user forgets to specify a required field on even just one service, I want to back off and not create any beans. Can I accomplish this with @ConditionalOnProperty? I want something like:
@Configuration
@ConditionalOnProperty({"service[i].name", "service[i].version"})
class AutoConfigureServices {
    ....
}
This is the custom Condition I created. It needs some polishing to be more generic (i.e. not hardcoding strings), but it worked great for me.
To use it, I annotated my Configuration class with @Conditional(RequiredRepeatablePropertiesCondition.class)
public class RequiredRepeatablePropertiesCondition extends SpringBootCondition {

    private static final Logger LOGGER = LoggerFactory.getLogger(RequiredRepeatablePropertiesCondition.class.getName());

    public static final String[] REQUIRED_KEYS = {
            "my.services[i].version",
            "my.services[i].name"
    };

    @Override
    public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata metadata) {
        List<String> missingProperties = new ArrayList<>();
        RelaxedPropertyResolver resolver = new RelaxedPropertyResolver(context.getEnvironment());
        Map<String, Object> services = resolver.getSubProperties("my.services");
        if (services.size() == 0) {
            missingProperties.addAll(Arrays.asList(REQUIRED_KEYS));
            return getConditionOutcome(missingProperties);
        }

        //gather indexes to check: [0], [1], [3], etc
        Pattern p = Pattern.compile("\\[(\\d+)\\]");
        Set<String> uniqueIndexes = new HashSet<String>();
        for (String key : services.keySet()) {
            Matcher m = p.matcher(key);
            if (m.find()) {
                uniqueIndexes.add(m.group(1));
            }
        }

        //loop each index and check required props
        uniqueIndexes.forEach(index -> {
            for (String genericKey : REQUIRED_KEYS) {
                String multiServiceKey = genericKey.replace("[i]", "[" + index + "]");
                if (!resolver.containsProperty(multiServiceKey)) {
                    missingProperties.add(multiServiceKey);
                }
            }
        });

        return getConditionOutcome(missingProperties);
    }

    private ConditionOutcome getConditionOutcome(List<String> missingProperties) {
        if (missingProperties.isEmpty()) {
            return ConditionOutcome.match(ConditionMessage.forCondition(RequiredRepeatablePropertiesCondition.class.getCanonicalName())
                    .found("property", "properties")
                    .items(Arrays.asList(REQUIRED_KEYS)));
        }
        return ConditionOutcome.noMatch(
                ConditionMessage.forCondition(RequiredRepeatablePropertiesCondition.class.getCanonicalName())
                        .didNotFind("property", "properties")
                        .items(missingProperties)
        );
    }
}
Old question, but I hope my answer will help for Spring Boot 2.x:
Thanks to @Brian, I checked the migration guide, where I was inspired by the example code. This code works for me:
final List<String> services = Binder.get(context.getEnvironment()).bind("my.services", List.class).orElse(null);
I did try to get a List of POJOs (as AutoConfigureService), but my class differs from AutoConfigureServices. For that purpose, I used:
final Services services = Binder.get(context.getEnvironment()).bind("my.services", Services.class).orElse(null);
Well, keep playing :-D
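For completeness, a minimal sketch (class, POJO, and property names are assumed here) of how such a Binder call could sit inside a Spring Boot 2.x SpringBootCondition, replacing the RelaxedPropertyResolver approach shown earlier:
import java.util.Collections;
import java.util.List;

import org.springframework.boot.autoconfigure.condition.ConditionOutcome;
import org.springframework.boot.autoconfigure.condition.SpringBootCondition;
import org.springframework.boot.context.properties.bind.Bindable;
import org.springframework.boot.context.properties.bind.Binder;
import org.springframework.context.annotation.ConditionContext;
import org.springframework.core.type.AnnotatedTypeMetadata;

public class ServicesDefinedCondition extends SpringBootCondition {

    // Simple POJO the list entries are bound to; field names mirror the properties.
    public static class ServiceProps {
        private String name;
        private String version;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public String getVersion() { return version; }
        public void setVersion(String version) { this.version = version; }
    }

    @Override
    public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata metadata) {
        // Bind the whole "my.services" list in one go; missing property yields an empty list.
        List<ServiceProps> services = Binder.get(context.getEnvironment())
                .bind("my.services", Bindable.listOf(ServiceProps.class))
                .orElse(Collections.emptyList());
        boolean complete = !services.isEmpty() && services.stream()
                .allMatch(s -> s.getName() != null && s.getVersion() != null);
        return complete
                ? ConditionOutcome.match("all my.services entries define name and version")
                : ConditionOutcome.noMatch("my.services is missing or has incomplete entries");
    }
}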
Here's my take on this issue with the use of custom conditions in Spring autoconfiguration. Somewhat similar to what @Strumbels proposed, but more reusable.
@Conditional annotations are evaluated very early during application startup. Property sources are already loaded, but ConfigurationProperties beans are not yet created. However, we can work around that issue by binding properties to a Java POJO ourselves.
First I introduce a functional interface which will enable us to define any custom logic checking whether properties are in fact present or not. In your case this method will take care of checking if the property List is empty/null and if all items within are valid.
public interface OptionalProperties {
    boolean isPresent();
}
Now let's create an annotation which will be meta-annotated with Spring's @Conditional and allow us to define custom parameters. prefix represents the property namespace and targetClass represents the configuration properties model class to which properties should be mapped.
@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Conditional(OnConfigurationPropertiesCondition.class)
public @interface ConditionalOnConfigurationProperties {

    String prefix();

    Class<? extends OptionalProperties> targetClass();
}
And now the main part. The custom condition implementation.
public class OnConfigurationPropertiesCondition extends SpringBootCondition {

    @Override
    public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata metadata) {
        MergedAnnotation<ConditionalOnConfigurationProperties> mergedAnnotation = metadata.getAnnotations().get(ConditionalOnConfigurationProperties.class);
        String prefix = mergedAnnotation.getString("prefix");
        Class<?> targetClass = mergedAnnotation.getClass("targetClass");

        // type precondition
        if (!OptionalProperties.class.isAssignableFrom(targetClass)) {
            return ConditionOutcome.noMatch("Target type does not implement the OptionalProperties interface.");
        }

        // the crux of this solution, binding properties to a Java POJO
        Object bean = Binder.get(context.getEnvironment()).bind(prefix, targetClass).orElse(null);

        // if properties are not present at all return no match
        if (bean == null) {
            return ConditionOutcome.noMatch("Binding properties to target type resulted in null value.");
        }

        OptionalProperties props = (OptionalProperties) bean;
        // execute the method from the OptionalProperties interface
        // to check if the condition should be matched or not;
        // can include any custom logic using property values in a type-safe manner
        if (props.isPresent()) {
            return ConditionOutcome.match();
        } else {
            return ConditionOutcome.noMatch("Properties are not present.");
        }
    }
}
Now you should create your own configuration properties class implementing the OptionalProperties interface:
@ConfigurationProperties("your.property.prefix")
@ConstructorBinding
public class YourConfigurationProperties implements OptionalProperties {

    // Service is your POJO representing the name and version subproperties
    private final List<Service> services;

    public YourConfigurationProperties(List<Service> services) {
        this.services = services;
    }

    @Override
    public boolean isPresent() {
        return services != null && services.stream().allMatch(Service::isValid);
    }
}
And then in Spring #Configuration class.
@Configuration
@ConditionalOnConfigurationProperties(prefix = "your.property.prefix", targetClass = YourConfigurationProperties.class)
class AutoConfigureServices {
    ....
}
There are two downsides to this solution:
The property prefix must be specified in two locations: on the @ConfigurationProperties annotation and on the @ConditionalOnConfigurationProperties annotation. This can partially be alleviated by defining a public static final String PREFIX = "namespace" in your configuration properties POJO, as in the sketch after this list.
The property binding process is executed separately for each use of our custom conditional annotation, and then once again to create the configuration properties bean itself. It happens only during app startup, so it shouldn't be an issue, but it is still an inefficiency.
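A minimal sketch of that mitigation (reusing the classes above; the constant name is assumed):
// Keep the prefix in one constant and reference it from both annotations.
@ConfigurationProperties(YourConfigurationProperties.PREFIX)
@ConstructorBinding
public class YourConfigurationProperties implements OptionalProperties {
    public static final String PREFIX = "your.property.prefix";
    // ...
}

@Configuration
@ConditionalOnConfigurationProperties(prefix = YourConfigurationProperties.PREFIX, targetClass = YourConfigurationProperties.class)
class AutoConfigureServices {
    // ...
}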
You can leverage the org.springframework.boot.autoconfigure.condition.OnPropertyListCondition class. For example, given you want to check for the service property having at least one value:
class MyListCondition extends OnPropertyListCondition {
    MyListCondition() {
        super("service", () -> ConditionMessage.forCondition("service"));
    }
}

@Configuration
@Conditional(MyListCondition.class)
class AutoConfigureServices {
}
See the org.springframework.boot.autoconfigure.webservices.OnWsdlLocationsCondition used on org.springframework.boot.autoconfigure.webservices.WebServicesAutoConfiguration#wsdlDefinitionBeanFactoryPostProcessor for an example within Spring itself.

How to use spring data with couchbase without _class attribute

Is there a simple way to use spring data couchbase with documents that do not have _class attribute?
In the couchbase I have something like this in my sampledata bucket:
{
"username" : "alice",
"created" : 1473292800000,
"data" : { "a": 1, "b" : "2"},
"type" : "mydata"
}
Now, is there any way to define mapping from this structure of document to Java object (note that _class attribute is missing and cannot be added) and vice versa so that I get all (or most) automagical features from spring couchbase data?
Something like:
If type field has value "mydata" use class MyData.java.
So when a find is performed, instead of automatically adding AND _class = "mydata" to the generated query, it should add AND type = "mydata".
Spring Data in general needs the _class field to know what to instantiate back when deserializing.
It's fairly easy in Spring Data Couchbase to use a different field name than _class, by overriding the typeKey() method in the AbstractCouchbaseDataConfiguration.
But it'll still expect a fully qualified class name in there by default.
Getting around that will require quite a bit more work:
You'll need to implement your own CouchbaseTypeMapper, following the model of DefaultCouchbaseTypeMapper. In the super(...) constructor, you'll need to provide an additional argument: a list of TypeInformationMapper. The default implementation doesn't explicitly provide one, so a SimpleTypeInformationMapper is used, which is the one that puts FQNs.
There's an alternative implementation that is configurable, so you can alias specific classes to a shorter name via a Map: ConfigurableTypeInformationMapper...
So by putting a ConfigurableTypeInformationMapper with the alias you want for specific classes + a SimpleTypeInformationMapper after it in the list (for the case where you serialize a class that you didn't provide an alias for), you can achieve your goal.
The typeMapper is used within the MappingCouchbaseConverter, which you'll unfortunately also need to extend (just to instantiate your typeMapper instead of the default).
Once you have that, again override the configuration to return an instance of your custom MappingCouchbaseConverter that uses your custom CouchbaseTypeMapper (the mappingCouchbaseConverter() method).
You can achieve this, e.g., by creating a custom annotation @DocumentType:
#DocumentType("billing")
#Document
public class BillingRecordDocument {
String name;
// ...
}
The document will look like:
{
    "type" : "billing",
    "name" : "..."
}
Just create the following classes:
Create a custom AbstractReactiveCouchbaseConfiguration or AbstractCouchbaseConfiguration (depending on which variant you use):
@Configuration
@EnableReactiveCouchbaseRepositories
public class CustomReactiveCouchbaseConfiguration extends AbstractReactiveCouchbaseConfiguration {

    // implement abstract methods
    // and configure the custom mapping converter

    @Bean(name = BeanNames.COUCHBASE_MAPPING_CONVERTER)
    public MappingCouchbaseConverter mappingCouchbaseConverter() throws Exception {
        MappingCouchbaseConverter converter = new CustomMappingCouchbaseConverter(couchbaseMappingContext(), typeKey());
        converter.setCustomConversions(customConversions());
        return converter;
    }

    @Override
    public String typeKey() {
        return "type"; // this will override '_class'
    }
}
Create a custom MappingCouchbaseConverter:
public class CustomMappingCouchbaseConverter extends MappingCouchbaseConverter {

    public CustomMappingCouchbaseConverter(final MappingContext<? extends CouchbasePersistentEntity<?>, CouchbasePersistentProperty> mappingContext,
                                           final String typeKey) {
        super(mappingContext, typeKey);
        this.typeMapper = new TypeBasedCouchbaseTypeMapper(typeKey);
    }
}
and the custom annotation @DocumentType:
@Persistent
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE})
public @interface DocumentType {
    String value();
}
Then create a TypeAwareTypeInformationMapper which will just check whether an entity is annotated with @DocumentType; if so, it uses the value from that annotation, and falls back to the default (fully qualified class name) if not:
public class TypeAwareTypeInformationMapper extends SimpleTypeInformationMapper {

    @Override
    public Alias createAliasFor(TypeInformation<?> type) {
        DocumentType[] documentType = type.getType().getAnnotationsByType(DocumentType.class);
        if (documentType.length == 1) {
            return Alias.of(documentType[0].value());
        }
        return super.createAliasFor(type);
    }
}
Then register it as follows:
public class TypeBasedCouchbaseTypeMapper extends DefaultTypeMapper<CouchbaseDocument> implements CouchbaseTypeMapper {

    private final String typeKey;

    public TypeBasedCouchbaseTypeMapper(final String typeKey) {
        super(new DefaultCouchbaseTypeMapper.CouchbaseDocumentTypeAliasAccessor(typeKey),
                Collections.singletonList(new TypeAwareTypeInformationMapper()));
        this.typeKey = typeKey;
    }

    @Override
    public String getTypeKey() {
        return typeKey;
    }
}
In your Couchbase configuration class you just need to have:
@Override
public String typeKey() {
    return "type";
}
Unfortunately, for query derivation (N1QL) the _class or type value still uses the class name. I tried Spring Data Couchbase 2.2.6 and it's a minus point here.
@Simon, are you aware of anything that has changed, or of upcoming support for a custom _class/type value in the next release(s)?
@SimonBasle
Inside the class N1qlUtils, in the method createWhereFilterForEntity, we have access to the CouchbaseConverter. On the line:
String typeValue = entityInformation.getJavaType().getName();
Why not use the typeMapper from the converter to get the name of the entity when we want to avoid using the class name? Otherwise you have to annotate each method in your repository as follows:
@Query("#{#n1ql.selectEntity} WHERE `type`='airport' AND airportname = $1")
List<Airport> findAirportByAirportname(String airportName);
If createWhereFilterForEntity used the CouchbaseConverter we could avoid annotating with @Query.
