My application expects to find a configuration file called MyPojo.json, loaded into a MyPojo class by a MyService class:
@Data // (Lombok's) getters and setters
public class MyPojo {
    int foo = 42;
    int bar = 1337;
}
It's not a problem if it doesn't exist: in that case, the application will create it with default values.
The path where to read/write MyPojo.json is stored in /src/main/resources/settings.properties:
the.path=cfg/MyPojo.json
which is passed to MyService through Spring's @PropertySource as follows:
@Configuration
@PropertySource("classpath:settings.properties")
public class MyService {

    @Inject
    Environment settings; // backed by "src/main/resources/settings.properties"

    @Bean
    public MyPojo load() throws Exception {
        MyPojo pojo = null;
        // "cfg/MyPojo.json"
        Path path = Paths.get(settings.getProperty("the.path"));
        if (Files.exists(path)) {
            pojo = new ObjectMapper().readValue(path.toFile(), MyPojo.class);
        } else { // JSON file is missing, create it with defaults.
            pojo = new MyPojo();
            Files.createDirectories(path.getParent()); // create "cfg/" if it does not exist yet
            new ObjectMapper().writeValue(path.toFile(), pojo); // create "cfg/MyPojo.json"
        }
        return pojo;
    }
}
Since MyPojo's path is relative, when I run this from a Unit Test
@Test
public void testCanRunMockProcesses() {
    try (AnnotationConfigApplicationContext ctx =
            new AnnotationConfigApplicationContext(MyService.class)) {
        MyPojo pojo = ctx.getBean(MyPojo.class);
        int foo = pojo.getFoo();
        ...
        // do assertion
    }
}
the cfg/MyPojo.json is created under the root of my project, which is definitely not what I want.
I would like MyPojo.json to be created under my target folder, e.g. /build in Gradle projects, or /target in Maven projects.
To do that, I've created a secondary settings.properties under src/test/resources, containing
the.path=build/cfg/MyPojo.json
and tried to feed it to MyService in several ways, without success.
Even if called by the test case, MyService is always reading src/main/resources/settings.properties instead of src/test/resources/settings.properties.
With two log4j2.xml resources instead (src/main/resources/log4j2.xml and src/test/resources/log4j2-test.xml), it worked :/
Can I do the same with a property file injected by Spring with @PropertySource?
You can use the @TestPropertySource annotation for this.
Example:
For a single property:
@TestPropertySource(properties = "property.name=value")
For a property file:
@TestPropertySource(locations = "classpath:yourproperty.properties")
So, you can provide the path for MyPojo.json like this:
@TestPropertySource(properties = "the.path=build/cfg/MyPojo.json")
Related
In a multi-tenant Spring Boot application, I'm trying to load configuration objects. Ideally, I'd like to load certain properties files into configuration objects programmatically. I'm looking for a simple way to load the configuration by passing a properties file and the final class to map it to. The following is just an example of what I'm trying to achieve.
Directory structure of the configurations:
config/
- common.properties
all_tenants_config/
- foo_tenant/
- database.properties
- api.properties
- bar_tenant/
- database.properties
- api.properties
Configuration POJOs:
class DatabaseProperties {
    @Value("${database.url}")
    private String url;
}

class APIProperties {
    @Value("${api.endPoint}")
    private String endPoint;
}
Configuration Provider:
@Singleton
class ConfigurationProvider {

    private Map<String, DatabaseProperties> DB_PROPERTIES = new HashMap<>();
    private Map<String, APIProperties> API_PROPERTIES = new HashMap<>();

    public ConfigurationProvider(@Value("${tenantsConfigPath}") String tenantsConfigPath) {
        for (File tenant : Path.of(tenantsConfigPath).toFile().listFiles()) {
            String tenantName = tenant.getName();
            for (File configFile : tenant.listFiles()) {
                String configName = configFile.getName();
                if ("database.properties".equals(configName)) {
                    // This is what I'm looking for: an easy way to load the configuration
                    // by passing a properties file and the final class to map it to.
                    DB_PROPERTIES.put(tenantName, SPRING_CONFIG_LOADER.load(configFile, DatabaseProperties.class));
                } else if ("api.properties".equals(configName)) {
                    API_PROPERTIES.put(tenantName, SPRING_CONFIG_LOADER.load(configFile, APIProperties.class));
                }
            }
        }
    }

    public DatabaseProperties currentTenantDBProperties() {
        return DB_PROPERTIES.get(CURRENT_TENANT_ID);
    }

    public APIProperties currentTenantAPIProperties() {
        return API_PROPERTIES.get(CURRENT_TENANT_ID);
    }
}
In short, is there a way in Spring to map a properties file to an object without using Spring's default configuration annotations?
Well, in this case you do not need any Spring feature. Spring is a bean container, but here you are just creating the objects yourself and putting them into your own map cache.
Step 1: load the properties file into a java.util.Properties object.
Step 2: convert that Properties object into your target object, using a utility such as Jackson's ObjectMapper:
Properties p = new Properties();
try (FileReader reader = new FileReader("db.properties")) { // replace the file name with your variable
    p.load(reader);
}

// Note: Jackson binds by matching keys to field/setter names, so a dotted key
// such as "database.url" only binds if the target field is mapped accordingly.
ObjectMapper mapper = new ObjectMapper();
DatabaseProperties databaseProperties = mapper.convertValue(p, DatabaseProperties.class);
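If you want the generic load(configFile, DatabaseProperties.class) helper from the question, the two steps above can be wrapped in a small utility. A minimal sketch (the class and method names are mine, not a Spring API):

import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.util.Properties;

public final class PropertiesBinder {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    private PropertiesBinder() {
    }

    // Loads a .properties file and binds it onto the given class.
    // Jackson matches keys to field names, so dotted keys such as "database.url"
    // only bind if the field is mapped accordingly (e.g. @JsonProperty("database.url")).
    public static <T> T load(File propertiesFile, Class<T> targetType) throws IOException {
        Properties properties = new Properties();
        try (Reader reader = new FileReader(propertiesFile)) {
            properties.load(reader);
        }
        return MAPPER.convertValue(properties, targetType);
    }
}

Usage, roughly matching the loop in the question: DB_PROPERTIES.put(tenantName, PropertiesBinder.load(configFile, DatabaseProperties.class));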
I know how I can access the application.properties values in @Service classes in Spring Boot, like below:
@Service
public class AmazonClient {

    @Value("${cloud.aws.endpointUrl}")
    private String endpointUrl;
}
But I am looking for an option to access this value directly in any class (a class without the @Service annotation), e.g.:
public class AppUtils {

    @Value("${cloud.aws.endpointUrl}")
    private String endpointUrl;
}
But this returns null. Any help would be appreciated.
I have already read this, but it didn't help.
There's no "magic" way to inject values from a property file into a class that isn't a bean. You can define a static java.util.Properties field in the class, load values from the file manually when the class is loading and then work with this field:
import java.io.InputStream;
import java.util.Properties;

public final class AppUtils {

    private static final Properties properties;

    static {
        properties = new Properties();
        try {
            ClassLoader classLoader = AppUtils.class.getClassLoader();
            InputStream applicationPropertiesStream = classLoader.getResourceAsStream("application.properties");
            properties.load(applicationPropertiesStream);
        } catch (Exception e) {
            // process the exception
        }
    }
}
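To actually work with that field from other code, you could then add a small static accessor to AppUtils (the method name is my own choice, not part of the original answer):

// Inside AppUtils: a hypothetical convenience accessor for the loaded properties.
public static String get(String key) {
    return properties.getProperty(key);
}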
You can easily achieve this by annotating your AppUtils class with the @Component annotation; Spring will then take care of creating it as a bean and injecting the property.
But if you don't want to take that approach, then look at the link below:
https://www.baeldung.com/inject-properties-value-non-spring-class
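For illustration, a minimal sketch of the @Component approach, assuming the value is only read through instances obtained from the Spring context (not created with new):

@Component
public class AppUtils {

    @Value("${cloud.aws.endpointUrl}")
    private String endpointUrl;

    public String getEndpointUrl() {
        return endpointUrl;
    }
}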
I have a Spring Boot command-line app where one of the production command-line args is the absolute base path. For this example we will call it
"/var/batch/"
I'm setting the base path in my production.yml file like so, with a default value:
company:
  basePath: ${basePath:/var/default/}
I then have an ApplicationConfig.java file that uses that base path to create a bunch of file paths like so.
@ConfigurationProperties(prefix = "company")
public class ApplicationConfig {

    private String basePath;

    public String getPrimaryCarePath() {
        return basePath + "ADAP-2-PCProv.dat";
    }

    public String getPrimaryCareDetailPath() {
        return basePath + "ADAP-2-" + getBatchNo() + ".det";
    }

    // ... additional files.
}
Lastly, the file paths get passed into my CSV parser like so:
public List<T> readCsv() throws IOException {
    try (BufferedReader bufferedReader = Files.newBufferedReader(Paths.get(filePath))) {
        return new CsvToBeanBuilder(bufferedReader)
                .withFieldAsNull(CSVReaderNullFieldIndicator.EMPTY_SEPARATORS)
                .withType(subClass)
                .withSeparator('\t')
                .withIgnoreLeadingWhiteSpace(true)
                .build().parse();
    }
}
Now everything works fine in production, but we face some issues when attempting to run mutation tests. It appears as if the CSV parser is looking for an absolute path rather than a relative path. We have the following path in our application-test.yml file:
company:
  basePath: src/test/resources/
All our test files are stored under the test resources folder, so my question is: how can we populate ApplicationConfig.java with a path relative to the test resources while still being able to use an absolute path in production? I was thinking I could override the base path in the test setup using ClassPathResource, but was wondering if there was a better approach.
You need two types of configuration: one for classpath resources and one for absolute paths.
I would suggest adding a new property app.file.path.type with the values resources and absolute. You can define a new interface named FilePathProvider:
public interface FilePathProvider {
    Path getFilePath();
}
You can define two different beans with @ConditionalOnProperty and select the file path strategy:
@Configuration
public class ApplicationConfig {

    @Bean
    @ConditionalOnProperty(
            name = "app.file.path.type",
            havingValue = "absolute")
    public FilePathProvider absoluteFilePathProvider(ApplicationConfig applicationConfig) {
        return () -> Paths.get(applicationConfig.getBasePath());
    }

    @Bean
    @ConditionalOnProperty(
            name = "app.file.path.type",
            havingValue = "resources")
    public FilePathProvider resourceFilePathProvider(ApplicationConfig applicationConfig) {
        return () -> Paths.get(this.getClass().getClassLoader().getResource(applicationConfig.getBasePath()).getPath());
    }
}
In development and test mode, you will have app.file.path.type=resources and in production you will have app.file.path.type=absolute.
The advantage of this approach is that you can set the property to absolute in development as well.
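For completeness, a hedged sketch of how the CSV-reading side could consume the provider; the class name and the resolve step are my assumptions, since the answer above only defines the two beans:

@Component
public class CsvFileLocator {

    private final FilePathProvider filePathProvider;

    public CsvFileLocator(FilePathProvider filePathProvider) {
        this.filePathProvider = filePathProvider;
    }

    // Resolves a data file name (e.g. "ADAP-2-PCProv.dat") against the configured base path,
    // so the same code works with both the absolute and the resources strategy.
    public Path resolve(String fileName) {
        return filePathProvider.getFilePath().resolve(fileName);
    }
}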
I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the Dev env), "test_users" (for the Test env), etc. (This is how our company uses the same DynamoDB account for different environments.)
So I would like to change the "tableName" of the model class using the environment variable passed through "AWS ECS task definition" environment parameters.
For example, my model class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use variables outside a class. Is there any way that I can use the Docker environment variable to select my table prefix?
My intent was to use the environment variable directly in the annotation's tableName, but I know that is not possible. Is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before describing the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class's package name as the "basePackage" path.
For (1), I replaced the table-name-overrider bean injection to avoid the error.
For (2), I printed the name that is being passed to this method, but it is null, so I am checking all the possible ways to pass the value here.
I haven't changed anything in my User model class, since the beans should replace the DynamoDBTable name when they are executed. But the table name overriding is not happening: data is still being pulled from the table name given at the model class level.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table name with a literal, you can add a bean like the one below; the prefix can be the environment name in your case.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For more details, check out the complete write-up here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
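As a hedged sketch of the @Value variant mentioned in the comment above, reusing the DOCKER_ENV variable from the question; note that withTableNamePrefix prepends the prefix to whatever is in @DynamoDBTable, so the model would declare the unprefixed name (e.g. tableName = "users"):

@Bean
public DynamoDBMapperConfig.TableNameOverride tableNameOverrider(
        @Value("${DOCKER_ENV:dev}") String env) {
    // "dev" -> "dev_users", "test" -> "test_users", assuming the model declares tableName = "users"
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(env + "_");
}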
I was able to get table names prefixed with the active profile name.
First, I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {

    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder()
                    .withTableNameResolver(new TableNameResolver(envProfile))
                    .build());
    return mapper;
}
The envProfile variable holds the active profile value, read from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
We have the same issue with regards to the need to change table names during runtime. We are using Spring-data-dynamodb 5.0.2 and the following configuration seems to provide the solutions that we need.
First, I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired into Spring via SpEL:
#Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method via the DynamoDBMapperConfig builder. But this will also do the job.
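That alternative would look roughly like this (a sketch only, not verified against spring-data-dynamodb 5.0.2):

@Bean
public DynamoDBTypeConverterFactory dynamoDBTypeConverterFactory() {
    // Expose the standard factory as a bean so it can be injected into getDynamoDBMapperConfig(...).
    return DynamoDBTypeConverterFactory.standard();
}

getDynamoDBMapperConfig(...) would then take the factory as a second parameter and pass it to builder.setTypeConverterFactory(...).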
I upvoted the other answer, but here is an idea.
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
#Profile(value= {"dev","default"})
#Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}
#Profile(value= {"prod"})
#Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {

    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}
I have a Spring bean which loads property files depending upon their availability, as shown below:
#PropertySources({ #PropertySource(value = "classpath:user.properties"),
#PropertySource(value = "file:./config/user.properties", ignoreResourceNotFound = true) })
The property file is getting loaded, but when I try to read the entire property file in one go via:
Properties properties = PropertiesLoaderUtils.loadAllProperties("user.properties");
then I only get the properties from the classpath. Does Spring provide any mechanism to read all the properties in one go?
That code of yours doesn't do what the annotations do. You have a couple of annotations that declare what to do; that logic isn't present at all in the code snippet.
There's no magic: if you want the same result, you need to translate the declarative aspects of those annotations into code (i.e. read the classpath file, then the external file, check whether it exists, and then merge those properties).
If you're OK with getting extra keys, you could also simply inject the Environment, as @PropertySource is going to update that.
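If you want to do that merge by hand, a minimal sketch using Spring's PropertiesLoaderUtils (the wrapper class and method names are mine):

import java.io.IOException;
import java.util.Properties;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PropertiesLoaderUtils;

public final class UserPropertiesLoader {

    // Loads the classpath copy first, then overlays ./config/user.properties if it exists,
    // mirroring what the two @PropertySource entries declare.
    public static Properties loadMergedUserProperties() throws IOException {
        Properties merged = PropertiesLoaderUtils.loadProperties(new ClassPathResource("user.properties"));
        Resource external = new FileSystemResource("./config/user.properties");
        if (external.exists()) {
            PropertiesLoaderUtils.fillProperties(merged, external);
        }
        return merged;
    }
}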
Answering my own question; maybe this will help someone.
I need to override the properties file contained in the jar with an external properties file (if present in the specified folder), and I also need to read the entire property file in one go.
I've leveraged Spring's behavior of letting the last property source read win.
#PropertySources({ #PropertySource(value = "classpath:application.properties"),
#PropertySource(value = "file:./config/application.properties", ignoreResourceNotFound = true) })
Now, if application.properties is present in the ./config/ location, it overrides the application.properties from the classpath.
In the main application.properties I've defined where the external properties should be loaded from, i.e.:
config.location=./config/
The ./config/ value can be overridden for the production and test environments.
After this I've defined a bean to load all the properties files (import statements skipped):
@Component
public class PropertiesConfig {

    private final Logger logger = LoggerFactory.getLogger(PropertiesConfig.class);

    private final String[] PROPERTIES_FILENAMES = { "prop1.properties", "prop2.properties",
            "prop3.properties" };

    private String configLocation;

    private Map<String, Properties> configProperties;

    @Autowired
    public PropertiesConfig(@Value("${config.location}") String configLocation) {
        this.configLocation = configLocation;
        configProperties = Arrays.stream(PROPERTIES_FILENAMES)
                .collect(Collectors.toMap(filename -> filename, this::loadProperties));
    }

    public Properties getProperties(String fileName) {
        if (StringUtils.isEmpty(fileName) || !configProperties.containsKey(fileName)) {
            logger.info(String.format("Invalid property name : %s", fileName));
            throw new IllegalArgumentException(
                    String.format("Invalid property name : %s", fileName));
        }
        return configProperties.get(fileName);
    }

    private Properties loadProperties(final String filename) {
        final Resource[] possiblePropertiesResources = { new ClassPathResource(filename),
                new PathResource(getCustomPath(filename)) };
        final Resource resource = Arrays.stream(possiblePropertiesResources)
                .filter(Resource::exists).reduce((previous, current) -> current).get();
        final Properties properties = new Properties();
        try {
            properties.load(resource.getInputStream());
        } catch (final IOException exception) {
            throw new RuntimeException(exception);
        }
        logger.info("Using {} as user resource", resource);
        return properties;
    }

    private String getCustomPath(final String filename) {
        return configLocation.endsWith(".properties") ? configLocation : configLocation + filename;
    }
}
Now you have a bean containing all the properties files in a map, which can be injected into any bean and overridden for any environment.
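A brief usage sketch (the service and file names are illustrative):

@Service
public class SomeService {

    private final Properties prop1;

    @Autowired
    public SomeService(PropertiesConfig propertiesConfig) {
        // Reads the merged prop1.properties (the external copy wins over the classpath copy).
        this.prop1 = propertiesConfig.getProperties("prop1.properties");
    }
}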