Auto-Generation of Tables - java

I'm trying to get Spring Data Cassandra 1.1.2 to work with Cassandra 2.1.2 and Spring 4.0.2, on Java 1.7.
Everything works fine - as far as I have tested - except that the tables/column families are not created automatically. I have tried to enable this with:
session.setSchemaAction(SchemaAction.RECREATE_DROP_UNUSED);
but it doesn't even try to create the tables. At least, with ALL logging enabled, I can't see anything.
I have tried to find some sample code but nothing worked. Any pointers or sample code would be very welcome.

Make your Cassandra configuration class extend AbstractCassandraConfiguration and override:
@Override
@Bean
public CassandraSessionFactoryBean session() throws Exception {
    CassandraSessionFactoryBean bean = new CassandraSessionFactoryBean();
    bean.setCluster(cluster().getObject());
    bean.setConverter(cassandraConverter());
    bean.setSchemaAction(getSchemaAction());
    bean.setKeyspaceName(getKeyspaceName());
    return bean;
}

@Override
public SchemaAction getSchemaAction() {
    return SchemaAction.RECREATE_DROP_UNUSED;
}

@Override
public String[] getEntityBasePackages() {
    return new String[] {"com.example"}; // the com.example package contains the entities annotated with @Table
}
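For completeness, here is a minimal sketch of an entity that such a configuration would pick up (the class and column details are assumed, not taken from the question); in Spring Data Cassandra 1.1.x the mapping annotations live in org.springframework.data.cassandra.mapping:
package com.example;

import org.springframework.data.cassandra.mapping.PrimaryKey;
import org.springframework.data.cassandra.mapping.Table;

// Hypothetical entity: any class under an entity base package that is
// annotated with @Table is picked up when the schema action runs.
@Table("person")
public class Person {

    @PrimaryKey
    private String id;

    private String name;

    // getters and setters omitted
}
Keep in mind that RECREATE_DROP_UNUSED drops and recreates the mapped tables on every startup, so it should never run against data you want to keep.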

Problem with setting tableName dynamically with Spring Boot and DynamoDB using older version of spring-data-dynamodb

I am working on a legacy project that uses Spring Boot 1.5.15.RELEASE, so I am stuck with spring-data-dynamodb version 4.5.7. My problem is closely related to this one.
I tried the solutions provided in that topic, including setting a TableNameOverride and also a custom TableNameResolver. Unfortunately, all the findBy methods handled automatically by spring-data-dynamodb:4.5.7 query a static table name - let's call it Item.
I also have some custom methods implemented in a RepositoryImpl class, where I @Autowire a DynamoDBMapper bean. Surprisingly, the queries implemented manually (not handled by spring-data-dynamodb:4.5.7) query the table with the correct prefix, i.e. TEST_Item.
ItemRepository (these methods do not apply the prefix when querying - they do not work and still query the Item table):
public interface ItemRepository extends CrudRepository<Item, ItemId>, ItemCustomRepository {
    Iterable<Item> findAllByInstanceId(Long instanceId);
    Iterable<Item> findAllByInstanceIdOrderByInsertedDateDesc(Long instanceId);
}
ItemRepositoryImpl (this method does apply the prefix when querying - it works fine and queries TEST_Item):
public class ItemRepositoryImpl implements ItemCustomRepository {

    @Autowired
    private DynamoDBMapper dynamoDBMapper;

    @Override
    public List<Item> findByInstanceIdAndMessageLike(Long instanceId, String search, String stepCode) {
        DynamoDBQueryExpression<Item> queryExpression = new DynamoDBQueryExpression<Item>();
        // ... here building some query - not important
        PaginatedQueryList<Item> queryList = dynamoDBMapper.query(Item.class, queryExpression);
        return queryList;
    }
}
Configuration class:
@Configuration
@EnableDynamoDBRepositories(amazonDynamoDBRef = "amazonDynamoDB", dynamoDBMapperConfigRef = "dynamoDBMapperConfig", basePackages = "com.shem.datasource.nosql.repository")
public class DynamoDBConfig {

    @Value("${amazon.dynamodb.region}")
    private String amazonDynamoDBRegion;

    @Value("${dynamo.proxy.host:}")
    private String proxyHost;

    @Value("${dynamo.proxy.port:80}")
    private Integer proxyPort;

    @Value("${dynamo.proxy.enabled:false}")
    private Boolean proxyEnabled;

    @Value("${amazon.dynamodb.tableNamePrefix:}")
    private String tableNamePrefix;

    @Bean
    public AmazonDynamoDB amazonDynamoDB() {
        ClientConfiguration clientConfig = new ClientConfiguration();
        if (proxyEnabled) {
            clientConfig.setProxyHost(proxyHost);
            clientConfig.setProxyPort(proxyPort);
            clientConfig.setProxyProtocol(Protocol.HTTP);
        }
        AmazonDynamoDB amazonDynamoDB = AmazonDynamoDBClientBuilder.standard()
                .withClientConfiguration(clientConfig)
                .withRegion(amazonDynamoDBRegion)
                .build();
        return amazonDynamoDB;
    }

    @Bean
    public DynamoDB dynamoDB(AmazonDynamoDB amazonDynamoDB) {
        return new DynamoDB(amazonDynamoDB);
    }

    @Bean
    public DynamoDBMapperConfig dynamoDBMapperConfig() {
        DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
        String finalPrefix = StringUtils.isNotBlank(tableNamePrefix) ? tableNamePrefix + "_" : "";
        builder.setPaginationLoadingStrategy(PaginationLoadingStrategy.ITERATION_ONLY);
        builder.setTableNameOverride(DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(finalPrefix));
        System.out.println("Prefix set to " + finalPrefix);
        return builder.build();
    }

    @Bean
    public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB, DynamoDBMapperConfig dynamoDBMapperConfig) {
        return new DynamoDBMapper(amazonDynamoDB, dynamoDBMapperConfig);
    }
}
It seems like the problem is somewhere in spring-data-dynamodb:4.5.7, but maybe someone has already faced this issue and can help. Any thoughts on how to overcome it?
The only solutions coming to my mind are not the best ideas:
- remove spring-data-dynamodb entirely and implement everything with the plain AWS Java SDK
- move to Spring Boot 2+ and then use a newer version of spring-data-dynamodb, but the project glows red when I try - time-consuming...
I recommend using the Amazon DynamoDB Java API V2. You can develop Spring Boot apps that use the latest features of this API, such as the enhanced client, which maps fields in a class to items in a table.
Here is a developer article that walks you step by step through developing a Spring Boot app that uses the DynamoDB Java API V2:
Creating the DynamoDB web application item tracker
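For illustration, here is a minimal sketch of the V2 enhanced client (class and table names are assumed); note that the table name is a plain string supplied at runtime, which makes per-environment prefixes trivial:
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedClient;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbTable;
import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;

@DynamoDbBean
public class Item {
    private String id;

    @DynamoDbPartitionKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
}

// elsewhere, e.g. in a @Bean method:
DynamoDbEnhancedClient enhanced = DynamoDbEnhancedClient.builder()
        .dynamoDbClient(DynamoDbClient.create())
        .build();
// the environment prefix is plain string concatenation at runtime
DynamoDbTable<Item> table = enhanced.table("TEST_Item", TableSchema.fromBean(Item.class));
Item item = table.getItem(r -> r.key(k -> k.partitionValue("some-id")));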

Unable to resolve variable from properties file when accessing it as a function parameter using the @Value annotation

This may be a silly question to ask, but I'm unable to find any satisfactory solution to my problem. In Java we don't have the concept of default parameter values, so I am trying to give my function parameters default values from a properties file using the @Value annotation, but I always get null and I'm unable to figure out why this is happening. Please help me solve the issue or provide some appropriate link/reference which may solve it.
Application.java
@SpringBootApplication
public class Application
{
    public static void main(String[] args)
    {
        ApplicationContext context = SpringApplication.run(Application.class, args);
        Sample sample = context.getBean(Sample.class);
        System.out.println(sample.check(null));
    }
}
Sample.java
public interface Sample
{
    public String check(String message);
}
SampleImpl.java
@Service
@PropertySource("classpath:app.properties")
public class SampleImpl implements Sample
{
    @Value("${test}")
    String message1;

    @Override
    public String check(@Value("${test}") String message)
    {
        return message;
    }
}
app.properties
test=anand
But you are passing null to your method... @Value is only honored at injection points that Spring itself processes (fields, constructors, @Bean/configuration methods); an annotation on the parameter of an ordinary method call has no effect.
Perhaps what you want to do is assign a default value to test in case it's not defined in the properties file:
@Value("${test:default}")
Then, when properties are autowired by Spring, if the placeholder resolver doesn't find the value in the properties file, it will use whatever comes after the colon.
The best use case for this (that I can think of) is when you create Spring configuration. Let's say you have a configuration class for DB access. Simply put:
@Configuration
public class DbConfig {

    @Value("${url:localhost}")
    String dbUrl;
    // rest for driver, user, pass, etc.

    public DataSource createDatasource() {
        // here you use some DataSourceBuilder to configure the connection
    }
}
Now, when the Spring application starts up, property values are resolved, and as described above you can switch between the value from the property file and a default. But this is done once, when the app starts and Spring creates your beans.
If you want to check an incoming argument at runtime, a simple null check is enough:
#Value("${test}")
String message1;
#Override
public String check(String message) {
if (message == null) {
return message1;
}
}

How to cache data during application startup in Spring boot application

I have a Spring Boot application connecting to a SQL Server database. I need some help using caching in my application. I have a CodeCategory table which holds lists of codes for many categories. This table is loaded every month and the data changes only once a month.
I want to cache this entire table when the application starts. Any subsequent call for this table should read from the cache instead of hitting the database.
For example:
List<CodeCategory> findAll();
I want to cache the above DB query's result during application startup. A call like List<CodeCategory> findByCodeValue(String code) should then fetch the result from the already cached data instead of calling the database.
Please let me know how this can be achieved using Spring Boot and Ehcache.
As pointed out, it takes some time for Ehcache to set up, and it does not work completely with @PostConstruct. In that case, use an ApplicationStartedEvent listener to load the cache.
GitHub Repo: spring-ehcache-demo
@Service
class CodeCategoryService {

    @Autowired
    private CodeCategoryRepository repo;

    @EventListener(classes = ApplicationStartedEvent.class)
    public void listenToStart(ApplicationStartedEvent event) {
        this.repo.findByCodeValue("100");
    }
}

interface CodeCategoryRepository extends JpaRepository<CodeCategory, Long> {
    @Cacheable(value = "codeValues")
    List<CodeCategory> findByCodeValue(String code);
}
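One caveat: @Cacheable only takes effect when caching is enabled. A minimal sketch, assuming an Ehcache configuration on the classpath that defines a cache named codeValues:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;

@SpringBootApplication
@EnableCaching // without this, @Cacheable on the repository is silently ignored
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}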
Note: there are multiple ways, as pointed out by others. You can choose as per your needs.
My way is to define a generic cache handler
@FunctionalInterface
public interface GenericCacheHandler {
    List<CodeCategory> getCodes();
}
And its implementation as below
@Component
@EnableScheduling // Important
public class GenericCacheHandlerImpl implements GenericCacheHandler {

    @Autowired
    private CodeRepository codeRepo;

    private List<CodeCategory> codes = new ArrayList<>();

    @PostConstruct
    private void initializeCodes() {
        List<CodeCategory> codeList = codeRepo.findAll();
        // Any customization goes here
        codes = codeList;
    }

    @Override
    public List<CodeCategory> getCodes() {
        return codes;
    }
}
Call it in the Service layer as below:
@Service
public class CodeServiceImpl implements CodeService {

    @Autowired
    private GenericCacheHandler genericCacheHandler;

    @Override
    public List<CodeCategory> anyMethod() {
        return genericCacheHandler.getCodes();
    }
}
Use second-level Hibernate caching to cache all the required DB queries.
For caching at application start-up, we can use @PostConstruct in any of the Service classes.
The syntax will be:
@Service
public class AnyService {

    @PostConstruct
    public void init() {
        // call any method
    }
}
Use the CommandLineRunner interface.
Basically, you can create a Spring @Component and implement the CommandLineRunner interface. You will have to override its run method, which is called at the start of the app.
@Component
public class DatabaseLoader implements CommandLineRunner {

    @Override
    public void run(String... args) {
        // Any code here gets called at the start of the app.
    }
}
This approach is mostly used to bootstrap the application with some initial data.

How to set tableName dynamically using environment variable in spring boot?

I am using AWS ECS to host my application and DynamoDB for all database operations, so I'll have the same database with different table names for different environments, such as "dev_users" (for the dev env), "test_users" (for the test env), etc. (This is how our company uses the same DynamoDB account for different environments.)
So I would like to change the tableName of the model class using an environment variable passed through the AWS ECS task definition's environment parameters.
For example, my model class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container to the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use variables outside the class. Is there any way that I can use the Docker env variable to select my table prefix?
My intent was to build the table name from that variable. I know this is not possible directly, but is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before describing them, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class's package name as the "basePackage" path.
For 1) I have replaced the table-name-overrider bean injection to avoid the error.
For 2) I printed the name that is passed to this method, but it is null. So I am checking all the possible ways to pass the value here.
I haven't changed anything in my User model class, as the beans should replace the name from @DynamoDBTable when they are executed. But the table name overriding is not happening - data is still pulled from the table name given at the model class level.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add the bean as shown below; the prefix can be the environment name in your case.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ...; // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For the complete details, check out:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
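As a sketch of how that bean is typically consumed (the wiring below is an assumption based on the linked wiki page, not part of the original answer), the override goes into the DynamoDBMapperConfig that spring-data-dynamodb is pointed at via dynamoDBMapperConfigRef:
@Bean
public DynamoDBMapperConfig dynamoDBMapperConfig(TableNameOverride tableNameOverrider) {
    return DynamoDBMapperConfig.builder()
            .withTableNameOverride(tableNameOverrider)
            .build();
}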
I was able to achieve table names prefixed with the active profile name.
First I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {

    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
The envProfile variable holds the active profile name, read from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
We have the same issue with regard to the need to change table names at runtime. We are using spring-data-dynamodb 5.0.2, and the following configuration provides the solution that we need.
First I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired by Spring via SpEL:
@Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2 we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method via the DynamoDBMapperConfig builder. But this also does the job.
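A sketch of that alternative, with the bean wiring assumed (the fluent with* methods come from the AWS SDK v1 DynamoDBMapperConfig.Builder):
@Bean
public DynamoDBTypeConverterFactory typeConverterFactory() {
    return DynamoDBTypeConverterFactory.standard();
}

@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride,
        DynamoDBTypeConverterFactory typeConverterFactory) {
    return new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverride)
            .withTypeConverterFactory(typeConverterFactory)
            .build();
}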
I upvoted the other answer, but here is another idea.
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
@Profile(value = {"dev", "default"})
@Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}

@Profile(value = {"prod"})
@Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {

    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}

Using POST method for big queries in Spring Data Solr

I am using Spring Data Solr in my project. In some cases the generated queries to Solr are too big (e.g. 15 KB+) and cause Solr exceptions. This solution: http://codingtricks.fidibuy.com/participant/join/54fce329b760506d5d9e7db3/Spring-Data-Solr-cannot-handle-long-queries
still fails for some queries.
Since sending those queries directly to Solr via POST works fine, I chose to work in this direction. I could not find any way in Spring Data Solr to configure the preferred method (GET/POST) for queries. Therefore, I came to the following solution: I extended HttpSolrServer
public class CustomSolrServer extends HttpSolrServer {

    public CustomSolrServer(String home, String core) {
        super(home);
        setCore(core);
    }

    @Override
    public QueryResponse query(SolrParams params) throws SolrServerException {
        METHOD method = METHOD.GET;
        if (isBigQuery(params)) {
            method = METHOD.POST;
        }
        return new QueryRequest(params, method).process(this);
    }
}
(some details are skipped; setCore() and isBigQuery() are trivial and skipped as well)
and use it as the SolrServer bean in SolrConfiguration.class:
@Configuration
@EnableSolrRepositories(basePackages = { "com.vvy.repository.solr" }, multicoreSupport = false)
@Import(value = SolrAutoConfiguration.class)
@EnableConfigurationProperties(SolrProperties.class)
public class SolrConfiguration {

    @Autowired
    private SolrProperties solrProperties;

    @Value("${spring.data.solr.core}")
    private String solrCore;

    @Bean
    public SolrServer solrServer() {
        return new CustomSolrServer(solrProperties.getHost(), solrCore);
    }
}
This works OK, but has a couple of drawbacks:
- I had to set multicoreSupport to false. This is because when Spring Data Solr implements repositories from the interfaces with multicoreSupport on, it uses MultiCoreSolrServerFactory and tries to store a server per core, which is done by cloning them into the holding map. Naturally, this crashes on a customized SolrServer, because SolrServerUtils doesn't know how to clone() it.
- I have to set the core manually instead of letting Spring Data extract it from the @SolrDocument annotation's parameter on the entity class.
Here are the questions:
1) The main and general one: is there any reasonable way to solve the problem of too-long queries in Spring Data Solr (or, more specifically, to use POST instead of GET)?
2) A minor one: is there a reasonable way to customize the SolrServer in Spring Data Solr and still maintain multicore support?
Answer for Q1: yes, you can use POST instead of GET.
Answer for Q2: yes, and you are already halfway there. What remains:
1) You have to rename CustomSolrServer to HttpSolrServer; see the method org.springframework.data.solr.server.support.SolrServerUtils#clone(T, java.lang.String) for the reason.
2) You don't have to specify a concrete core name. You can specify the core name with the annotation org.springframework.data.solr.core.mapping.SolrDocument on the corresponding Solr model.
3) Set multicoreSupport = true.
According to your sample classes, they should look like the following:
package com.x.x.config;

import org.apache.solr.client.solrj.SolrRequest;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.params.SolrParams;

public class HttpSolrServer extends org.apache.solr.client.solrj.impl.HttpSolrServer {

    public HttpSolrServer(String host) {
        super(host);
    }

    @Override
    public QueryResponse query(SolrParams params) throws SolrServerException {
        SolrRequest.METHOD method = SolrRequest.METHOD.POST;
        return new QueryRequest(params, method).process(this);
    }
}
@Configuration
@EnableSolrRepositories(basePackages = { "com.vvy.repository.solr" }, multicoreSupport = true)
@Import(value = SolrAutoConfiguration.class)
@EnableConfigurationProperties(SolrProperties.class)
public class SolrConfiguration {

    @Autowired
    private SolrProperties solrProperties;

    @Bean
    public SolrServer solrServer() {
        return new com.x.x.config.HttpSolrServer(solrProperties.getHost());
    }
}
P.S. The latest spring-data-solr 3.x already supports a custom query request method; see the linked issue.
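For newer stacks, a minimal sketch (core URL assumed): plain SolrJ clients accept the HTTP method directly on query, which removes the need for the subclass above:
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrRequest;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

HttpSolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycore").build();
// POST avoids the URI length limit that breaks very long GET queries
QueryResponse rsp = client.query(new SolrQuery("*:*"), SolrRequest.METHOD.POST);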
