I am trying to use Apache Ignite with Couchbase as the backing persistence layer. The idea is to use Ignite as a data grid so that any changes made to the Ignite in-memory cache are eventually written through to Couchbase. I am implementing this with Spring Boot and Spring Data. The igniteConfiguration bean looks like this:
@Bean(name = "igniteConfiguration")
public IgniteConfiguration igniteConfiguration() {
    IgniteConfiguration igniteConfiguration = new IgniteConfiguration();
    CacheConfiguration cache = new CacheConfiguration("sample");
    cache.setAtomicityMode(CacheAtomicityMode.ATOMIC);
    // The line below is where I am confused
    cache.setCacheStoreFactory(FactoryBuilder.factoryOf(CacheStoreImplementationWithCouchbaseRepositoryBean.class));
    cache.setCacheStoreSessionListenerFactories(); // no session listener factories configured yet
    cache.setReadThrough(true);
    cache.setWriteThrough(true);
    igniteConfiguration.setCacheConfiguration(cache);
    return igniteConfiguration;
}
I need to provide an implementation of Ignite's CacheStore interface in order to connect Couchbase as the backend data store. I configured a Couchbase repository class and tried to use a bean of it inside CacheStoreImplementationWithCouchbaseRepositoryBean.java. But this repository bean is never initialized, because the CacheStore implementation is not in the Spring context, so the field is always null.
Since I used Spring Data, I now have:
a Spring Data repository for Ignite
a Spring Data repository for Couchbase
one implementation of Ignite's CacheStore interface
But I am not sure how to pass the Couchbase repository bean into the CacheStore implementation so that it uses this repository to execute CRUD operations against Couchbase internally.
bivrantoshakil -
"I need to provide one implementation of cacheStore interface of Ignite in order to connect couchbase as backend data store. I configured Couchbase Repository class and created a bean of it in the CacheStoreImplementationWithCouchbaseRepositoryBean.java"
What documentation/tutorial are you following?
Please zip up and post your project and I will investigate.
I have a few tangential thoughts
You may wish to just use Couchbase without Ignite, as the key-value API is really fast, with persistence being asynchronous and configurable.
There is a tutorial on Spring Data caching with Couchbase at https://blog.couchbase.com/couchbase-spring-cache/ You can skip to "Adding Spring Boot Caching Capabilites".
There is an Apache Ignite tutorial here - https://www.baeldung.com/apache-ignite-spring-data
Take a look here to see how store factories are configured: https://github.com/apache/ignite/tree/master/examples/src/main/java/org/apache/ignite/examples/datagrid/store
I recommend decoupling your project from Couchbase and/or spring-data to make it simpler, and then adding the appropriate components one by one to see where the failure is.
If you need one bean to be initialized before another bean, consider using the appropriate annotations (e.g. @DependsOn), as sketched below.
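For example, a minimal sketch of the initialization-order annotation mentioned in that last point; the bean name "couchbaseRepositoryBean" is illustrative, not from the thread:
import org.apache.ignite.configuration.IgniteConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;

@Configuration
public class IgniteConfig {

    // Forces Spring to create the Couchbase repository bean before this one.
    @Bean(name = "igniteConfiguration")
    @DependsOn("couchbaseRepositoryBean")
    public IgniteConfiguration igniteConfiguration() {
        return new IgniteConfiguration();
    }
}
Note that this only controls the order in which Spring creates its own beans; it does not by itself put the Ignite-instantiated CacheStore into the Spring context.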
I found a workaround for the issue I was having above. First I created a class to get hold of the Spring application context:
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

@Component
public class ApplicationContextProvider implements ApplicationContextAware {

    private static ApplicationContext context;

    /**
     * Returns the Spring managed bean instance of the given class type if it
     * exists. Returns null otherwise.
     *
     * @param beanClass the type of the bean to look up
     * @return the bean instance
     */
    public static <T extends Object> T getBean(Class<T> beanClass) {
        return context.getBean(beanClass);
    }

    @Override
    public void setApplicationContext(ApplicationContext context) throws BeansException {
        // store the ApplicationContext reference to access required beans later on
        ApplicationContextProvider.context = context;
    }
}
Then, in CacheStoreImplementationWithCouchbaseRepositoryBean, I did this:
// Repository bean
private CouchbaseRepository repository;

@Override
public Employee load(Long key) {
    repository = ApplicationContextProvider.getBean(CouchbaseRepository.class);
    return repository.findById(key).orElse(null);
}
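For completeness, a minimal sketch of how the remaining CacheStore methods could delegate to the same repository, assuming the store extends Ignite's CacheStoreAdapter<Long, Employee> and that CouchbaseRepository is the thread's own Spring Data repository interface typed to Employee and Long. Only load appears in the thread; the write and delete overrides below are illustrative:
import javax.cache.Cache;
import org.apache.ignite.cache.store.CacheStoreAdapter;

public class CacheStoreImplementationWithCouchbaseRepositoryBean extends CacheStoreAdapter<Long, Employee> {

    // Resolve the Spring-managed repository lazily, as in the workaround above.
    private CouchbaseRepository repository() {
        return ApplicationContextProvider.getBean(CouchbaseRepository.class);
    }

    @Override
    public Employee load(Long key) {
        return repository().findById(key).orElse(null);
    }

    @Override
    public void write(Cache.Entry<? extends Long, ? extends Employee> entry) {
        // write-through: persist the cache entry to Couchbase
        repository().save(entry.getValue());
    }

    @Override
    public void delete(Object key) {
        // remove the corresponding document from Couchbase
        repository().deleteById((Long) key);
    }
}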
There are examples of using the Couchbase cache in
git clone git@github.com:spring-projects/spring-data-couchbase.git
cd spring-data-couchbase
./java/org/springframework/data/couchbase/cache/CouchbaseCacheCollectionIntegrationTests.java
./java/org/springframework/data/couchbase/cache/CouchbaseCacheIntegrationTests.java
Related
I am just starting to learn Spring Boot and am trying to use a cache with the Google Sheets API as a study exercise. To use a cache with Google Sheets in Spring Boot, I added the spring-boot-starter-cache dependency to my pom.xml file and enabled cache support with the @EnableCaching annotation on my application's main class. I then used the @Cacheable annotation to cache the response from the Google Sheets API when calling a method to get a spreadsheet. However, the cache does not work, as the method is still being called every time.
Here is an example of the code I am using:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class GoogleSheetsService {

    private static final Logger logger = LoggerFactory.getLogger(GoogleSheetsService.class);

    @Cacheable(value = "spreadsheet", key = "#spreadsheetId")
    public Spreadsheet getSpreadsheet(String spreadsheetId) {
        logger.info("Getting spreadsheet from API for {}", spreadsheetId);
        // Call Google Sheets API to get spreadsheet
        return spreadsheet;
    }
}
I expected the cache to return the cached value for the same spreadsheetId, but the method is being called every time and the "Getting spreadsheet from API for {}" log is being printed. Can you please help me understand what might be causing the cache to not work as expected?
I suspect two things.
Are other methods of GoogleSheetsService calling the getSpreadsheet method?
@Cacheable relies on Spring Framework's AOP proxies, so it does not apply when a method is called from another method of the same class.
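To illustrate the self-invocation limitation (the class below is a made-up example, with Spreadsheet standing in for whatever model type your service returns): a call that stays inside the same bean never goes through the caching proxy, so @Cacheable is silently ignored.
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class ReportService { // hypothetical class, for illustration only

    @Cacheable(value = "spreadsheet", key = "#spreadsheetId")
    public Spreadsheet getSpreadsheet(String spreadsheetId) {
        return fetchFromGoogleSheets(spreadsheetId); // imagine an expensive API call
    }

    public Spreadsheet buildReport(String spreadsheetId) {
        // Self-invocation: this call never leaves the class, so it bypasses the
        // Spring caching proxy and @Cacheable has no effect here.
        return this.getSpreadsheet(spreadsheetId);
    }

    private Spreadsheet fetchFromGoogleSheets(String spreadsheetId) {
        return new Spreadsheet(); // placeholder for the real API call
    }
}
Calling getSpreadsheet from another bean (through an injected ReportService reference) goes through the proxy, and the cache is consulted.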
Have you registered the spreadsheet cache with a CacheManager?
With Java config, it looks like this:
import static java.util.Arrays.asList;

@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public CacheManager cacheManager() {
        ConcurrentMapCacheManager mgr = new ConcurrentMapCacheManager();
        mgr.setCacheNames(asList("spreadsheet"));
        return mgr;
    }
}
Is the spreadsheet cache registered with your CacheManager?
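Alternatively, if you stay with Spring Boot's auto-configured simple cache manager instead of defining your own bean, the cache name can also be declared in application.properties (to the best of my knowledge these properties are supported out of the box):
# assumption: using Spring Boot's simple (ConcurrentMap) cache provider
spring.cache.type=simple
spring.cache.cache-names=spreadsheet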
In addition to a non-reactive JPA repository, I introduced a reactive repository in my Spring Boot app with an H2 database.
com.app.repository.BusinessRepository extends JpaRepository
com.app.repository.r2dbc.PendingBusinessRepository extends ReactiveCrudRepository
And I added a connection factory for reactive H2:
@Configuration
@EnableR2dbcRepositories
public class R2DBCConfiguration extends AbstractR2dbcConfiguration {

    @Bean
    public H2ConnectionFactory connectionFactory() {
        return new H2ConnectionFactory(
                H2ConnectionConfiguration.builder()
                        .url("jdbc:h2:file:~/data/demo-rxdb")
                        .username("sa")
                        .password("password")
                        .build());
    }
}
After this change, my application is not able to find the non-reactive repository. It says:
Field businessRepository in
com.app.service.BusinessServiceImpl required a bean
of type 'com.app.repository.BusinessRepository' that could not be
found.
The injection point has the following annotations:
@org.springframework.beans.factory.annotation.Autowired(required=true)
Action:
Consider defining a bean of type
'com.app.repository.BusinessRepository' in your configuration.
I can only guess that my H2 database is now reactive (non-blocking), which is not supported by a blocking JpaRepository. Is my assumption correct?
I have mixed needs: I need only the PendingBusiness table's data in a non-blocking way (it is constantly streamed out to the UI through an event stream), and I need the rest of the tables' data in the traditional blocking way. Is it possible to achieve this with a single H2 database instance?
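This question is left unanswered in the thread. One direction worth checking (an assumption on my part, not a verified fix for this exact project) is that once several Spring Data modules are on the classpath, Spring Boot switches to strict repository detection, so each module may need to be pointed at its own base package explicitly:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.data.r2dbc.repository.config.EnableR2dbcRepositories;

@Configuration
@EnableJpaRepositories(basePackages = "com.app.repository")          // blocking JPA repositories
@EnableR2dbcRepositories(basePackages = "com.app.repository.r2dbc")  // reactive R2DBC repositories
public class RepositoryConfig {
}
The JPA side would still need a regular JDBC DataSource alongside the R2DBC ConnectionFactory, and whether one file-based H2 instance can be shared between the two is something to verify separately.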
I'm trying to build a REST application using the following tech stack:
Spring
VueJs
JPA (Hibernate)
This is my first experience writing a Spring application, and my first experience with web app development overall.
I have 4 tables in my database:
Language
Sentence
Rule
User
For example, in Rule there is:
Rule create(EntityManagerFactory factory, String type, String hint, String help, Language language);
List<Rule> readAll(EntityManagerFactory factory);
Rule readID(EntityManagerFactory factory, int id);
void update(EntityManagerFactory factory, String type, String hint, String help, Language language);
So here are my questions:
When I create controllers for each table, I use the CRUD methods to modify (or not) my database, and I return a view for my HTML and VueJS part. But my methods need an EntityManagerFactory; should I create a field in each controller class, or is this not how I'm supposed to do it?
Do I need to create a bean configuration file, or are persistence.xml and pom.xml enough?
Thanks
Seems like your first question can be broken up into multiple concerns.
When I create controllers for each table, I use the CRUD methods to modify (or not) my database, and I return a view for my HTML and VueJS part. But my methods need an EntityManagerFactory; should I create a field in each controller class, or is this not how I'm supposed to do it?
Since you have already accepted an answer that recommends the use of spring-data-jpa, you will be dealing with entities and repositories.
Entities are JPA-managed classes that map to and interact with your database tables.
@Entity
@Table(name = "rule")
public class Rule {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    long id;

    String type;
    ...

    @OneToOne
    @JoinColumn(...)
    Language language;
}
Repositories provide all the operations required to act against your database. With Spring Data JPA you can create an interface that extends CrudRepository, which gives you basic CRUD operations for free: findOne(/* id */), delete(), save().
@Repository
public interface RuleRepository extends CrudRepository<Rule, Long> {
    // You can easily specify custom finders
    public List<Rule> findByType(String type);
}
But my methods need an EntityManagerFactory; should I create a field in each controller class, or is this not how I'm supposed to do it?
It's typically frowned upon to use a JPA entity directly as a request/response object. See the linked answer for "Should I use JPA entities in REST requests and/or responses?"
There are multiple approaches you can take to accept a controller request and send a response back to your client-side project.
@RestController
public class RuleController {

    @Autowired
    private RuleRepository ruleRepository;

    // Approach 1: Use request parameters - enforce inputs
    @PostMapping("/rule/{id}")
    public Rule postWithRequestParams(@PathVariable("id") Long id,
                                      @RequestParam("type") String type,
                                      @RequestParam("hint") String hint,
                                      @RequestParam("languageField1") String languageField1) {
        Rule inputRule = new Rule(id, type, hint, new Language(languageField1));
        Rule responseRule = ruleRepository.save(inputRule);
        return responseRule; // I would imagine you would want to set up a model for the response itself
    }

    // Approach 2 (an alternative to approach 1; don't map both to the same path at once):
    // use @RequestBody to deserialize the Rule from the request
    @PostMapping("/rule/{id}")
    public Rule postWithRequestBody(@PathVariable("id") Long id, @RequestBody Rule inputRule) {
        Rule responseRule = ruleRepository.save(inputRule);
        return responseRule;
    }
}
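Tying this back to the comment about a response model: a minimal sketch of a dedicated response DTO. RuleDto and the getters on Rule are assumptions here, not from the question; the point is only to keep the JPA entity out of the HTTP layer.
// Hypothetical response model: exposes only what the client needs,
// decoupled from the JPA entity (assumes Rule has corresponding getters).
public class RuleDto {

    private final long id;
    private final String type;
    private final String hint;

    public RuleDto(Rule rule) {
        this.id = rule.getId();
        this.type = rule.getType();
        this.hint = rule.getHint();
    }

    public long getId() { return id; }
    public String getType() { return type; }
    public String getHint() { return hint; }
}
The controller methods would then return new RuleDto(responseRule) instead of the entity itself.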
Do I need to create a bean configuration file, or are persistence.xml and pom.xml enough?
If you have added spring-boot-starter-data-jpa as a dependency, a lot of the bean configuration has already been done for you.
In src/main/resources you should have an application.properties or application.yml:
spring.datasource.url= # JDBC url of the database.
spring.datasource.username= # Login user of the database.
spring.datasource.password= # Login password of the database.
Spring does a lot of the magic and heavy lifting for you.
If you are using Spring Boot then you don't need an EntityManager directly. All you need to do is define the datasource in your properties file and create a bean in your configuration class, like this:
@Bean
@Primary
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource datasource() {
    return DataSourceBuilder.create().build();
}
The rest you can then handle with repositories. They will look like this:
import org.springframework.data.repository.CrudRepository;
public interface RuleRepository extends CrudRepository<Rule, Long> {
}
In your controllers you will use it like this:
@Autowired
private RuleRepository ruleRepository;

@GetMapping("/getRule/{id}")
public Rule find(@PathVariable("id") Long id) {
    return ruleRepository.findOne(id);
}
These are the dependencies I used in my Gradle project; the Maven equivalents are shown below:
compile('org.springframework.boot:spring-boot-starter-data-jpa')
compile group: 'mysql', name: 'mysql-connector-java'
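For reference, the Maven equivalents of the two Gradle dependencies above (versions omitted, since they would normally be managed by the Spring Boot parent/BOM):
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
</dependency>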
You definitely need to have a look at Spring Boot (http://start.spring.io); it makes it easier to start web app development. For the persistence layer you could use the Spring Data JPA module (which already includes Hibernate), and it integrates easily with Spring Boot. The beauty of Spring Data is that most common queries, such as save(), remove(), find() and so on, are already written for you. You only need to define the objects that Spring Data will work with.
Update: See my Spring Boot REST API example here
I have a REST API built on Spring Boot consisting of 2 separate web services. I don't know whether those two web services will be hosted on the same machine, so I want to provide a remote and a local implementation for every service. Example below:
Local service implementation:
public class LocalExampleService implements ExampleService {
    public Item getItem(long id) {
        // Get item using implementation from another local project
    }
}
Remote service implementation:
public class RemoteExampleService implements ExampleService {

    @Value("${serviceURL}")
    private String serviceURL;

    public Item getItem(long id) {
        // Get item calling remote service
    }
}
Controller:
public class MyController {

    @Autowired
    private ExampleService exampleService;
}
The web service has many services with local and remote implementations, and I want to let Spring know which type of implementation it should choose for all of them.
I've been thinking about putting a URL in the properties file; during initialization the app would check whether the properties contain the URL and then autowire the service appropriately. But then I would have to write that autowiring logic for every service.
What's the best option to autowire the correct service implementation automatically?
You can use Spring profiles to control which implementation should be used, via Spring properties.
In your Spring properties add the entry below:
spring.profiles.active=NAME_OF_ACTIVE_PROFILE
Every service implementation needs a @Profile annotation. This is how your service implementations should look:
@Component
@Profile("local")
public class LocalExampleService implements ExampleService {}

@Component
@Profile("remote")
public class RemoteExampleService implements ExampleService {}
If your project needs to use the local implementation of a service, then in the properties file insert local instead of NAME_OF_ACTIVE_PROFILE, otherwise remote.
For fully automatic autowiring you need to add a method, run at startup, that checks whether the local implementation class exists and then sets the profile accordingly. To do this, modify the Spring Boot main method:
public static void main(String[] args) {
    String profile = checkCurrentProfile(); // method that decides which profile should be used
    System.setProperty(AbstractEnvironment.ACTIVE_PROFILES_PROPERTY_NAME, profile);
    SpringApplication.run(MyApplication.class, args);
}
If you choose this approach then you don't need the previous entry in the properties file.
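A minimal sketch of what the checkCurrentProfile() helper referenced above could look like, assuming the fully qualified name of the local implementation is known at build time (the class name below is made up):
private static String checkCurrentProfile() {
    try {
        // hypothetical fully qualified name of the local implementation
        Class.forName("com.example.service.LocalExampleService");
        return "local";
    } catch (ClassNotFoundException e) {
        return "remote";
    }
}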
This is my attempt to implement something like https://github.com/StanislavLapitsky/SpringSOAProxy
The idea is to check whether a Spring bean can be found locally; if not, a proxy is automatically created which internally uses RestTemplate to call the same service remotely.
You need to define the contract (service interfaces plus DTOs) and a URL resolver to specify which URL should be used for each service.
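A rough sketch of that idea for a single service, assuming Spring Boot: @ConditionalOnMissingBean registers an HTTP-backed ExampleService only when no local implementation bean exists. The serviceURL property comes from the question above; the /items/{id} path is an assumption for illustration.
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;

@Configuration
public class RemoteFallbackConfiguration {

    @Bean
    @ConditionalOnMissingBean(ExampleService.class)
    public ExampleService remoteExampleService(RestTemplateBuilder builder,
                                               @Value("${serviceURL}") String serviceURL) {
        RestTemplate restTemplate = builder.build();
        return new ExampleService() {
            @Override
            public Item getItem(long id) {
                // delegate to the remote web service over HTTP
                return restTemplate.getForObject(serviceURL + "/items/{id}", Item.class, id);
            }
        };
    }
}
The linked project generalizes this by generating such RestTemplate-backed proxies dynamically for every contract interface.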
Hello, I am trying to create an application using the Dropwizard framework. I have DAO implementation classes which need a handle to a connection manager instance, which will then be used to get database connections. I have a multi-tenant database application, and this connection manager would be a custom implementation.
The application uses HikariCP as the connection pool and a MySQL database. I want to initialize the datasource and connection pool using Dropwizard's managed object feature. Once the datasource is initialized, I want to inject the connection manager instance into each of the DAO classes using a Guice binding, something like:
bind(ConnectionManager.class).toProvider(ConnectionManagerProvider.class);
Then in each DAO implementation class:
public class UserDAOImpl extends AbstractDAO {

    @Inject
    public UserDAOImpl(ConnectionManager connectionManager) {
        super(connectionManager);
    }
}
I have looked everywhere on the net and there is no particular example for my use case. There is also a lack of documentation at dropwizard.io.
This is more of an architectural design question than a code question.
The datasource module would be a separate module used by many services. I am using Maven as the build tool.
My questions are:
How can I approach this situation? Some class names and implementation guidelines would be very useful.
The application will be handling half a million requests a day, so the solution needs to hold up under that load.
I look forward to any guidance from the community, or pointers to some good resources.
NOTE: We won't be using Hibernate for this application; we will be using JDBI.
I prepared a setup similar to the one you described, as follows. It sets up Guice and initializes a DBIFactory (you might need to adapt that part to your scenario). Then a DBI object is handed over to a repository implementation that can use it to persist an entity of type Vessel.
(1) Adding guice to the project
<dependency>
    <groupId>com.hubspot.dropwizard</groupId>
    <artifactId>dropwizard-guice</artifactId>
    <version>x.x.x</version>
</dependency>
(2) Setup Guice in initialize():
guiceBundle = GuiceBundle.<YourConfiguration>newBuilder()
        .addModule(new GuiceModule())
        .enableAutoConfig("your.package.name.heres")
        .setConfigClass(YourConfiguration.class)
        .build();
(3) Guice config for preparing JDBI elements
public class GuiceModule extends AbstractModule {

    private DBI jdbi;

    @Provides
    public DBI prepareJdbi(Environment environment,
                           YourConfiguration configuration) throws ClassNotFoundException {
        // setup DB access including DAOs
        // implementing a singleton pattern here but avoiding
        // Guice initializing the DB connection too early
        if (jdbi == null) {
            final DBIFactory factory = new DBIFactory();
            jdbi = factory.build(environment, configuration.getDataSourceFactory(), "h2");
        }
        return jdbi;
    }

    @Provides
    public VesselJDBI prepareVesselJdbi(DBI jdbi) {
        return jdbi.onDemand(VesselJDBI.class);
    }

    @Override
    protected void configure() {
        bind(VesselRepository.class).to(VesselRepositoryImpl.class);
        /* ... */
    }
}
(4) start using it in your classes
public class VesselRepositoryImpl implements VesselRepository {

    private VesselJDBI jdbi;

    @Inject
    public VesselRepositoryImpl(VesselJDBI jdbi) {
        this.jdbi = jdbi;
    }

    public Vessel create(Vessel instance) {
        return jdbi.inTransaction((transactional, status) -> {
            /* do several things with jdbi in a transactional way */
        });
    }
}
(Please note: the last code example uses Java 8. To use JDBI with Java 8 and Dropwizard 0.8.1, please use jDBI version 2.62 to avoid bug https://github.com/jdbi/jdbi/issues/144.)
Please let me know if this helped you.
Best regards,
Alexander
I can't comment, but wanted to add on to Alex's answer:
For the repository implementation, I recommend having the repository be handled by jDBI instead of using Guice. Here's what I did:
In the Guice module, add a provider method (it must return the repository so Guice can bind it):
@Provides
@Singleton
public Repo repository(DBI dbi) {
    return dbi.onDemand(Repo.class); // or whatever your repository class is
}
In the repository class, use @CreateSqlObject to have your DAOs available:
public abstract class Repo {

    @CreateSqlObject
    abstract Dao dao(); // will return a jDBI-managed DAO impl

    public void doWhatever() {
        // logic
    }
}
This has the distinct advantage that you can now use jDBI annotations (I have not found a way to use them with Guice directly). This is very nice, for example, if you need to execute DAO code in a transaction. The repository is still handled within Guice, so it can be injected anywhere, but jDBI handles the tricky bits within your DAO/repository code.
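For instance, the transaction case could look roughly like this with the jDBI 2 SQL Object API (the DAO method names are made up, and you should check which annotations your jDBI version provides):
import org.skife.jdbi.v2.sqlobject.CreateSqlObject;
import org.skife.jdbi.v2.sqlobject.Transaction;

public abstract class Repo {

    @CreateSqlObject
    abstract Dao dao();

    @Transaction
    public void moveItem(long fromId, long toId) {
        // both calls run in the same transaction; hypothetical DAO methods
        dao().removeItem(fromId);
        dao().addItem(toId);
    }
}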
Hope this helps :)
Artur