Spring Boot REST API stuck after N parallel save requests - java

I'm creating a REST API with Spring Boot. I'm using the default Hikari configuration, so the pool has 10 connections.
I hit an error when I post 10 parallel requests to a specific route. The error says: Connection is not available, request timed out after 30001ms.
This route saves data into a MySQL database, and a single save operation usually takes only a few milliseconds to complete.
Why is this happening? Shouldn't each request complete its save operation and then free the database connection for the next one?
The error seems to appear only with save operations where the save function creates a new entity.
Properties
spring.jpa.hibernate.ddl-auto=create-drop
spring.datasource.url=jdbc:mysql://192.168.1.88:3306/question?serverTimezone=UTC
spring.datasource.username=luigi
spring.datasource.password=root
Repository
public interface RandomRepository extends CrudRepository<Random, Integer> {
}
Controller
@RestController
public class RandomController {

    private RandomRepository randomRepository;

    public RandomController(RandomRepository randomRepository) {
        this.randomRepository = randomRepository;
    }

    @GetMapping("/")
    public String createRandom() {
        return String.valueOf(Math.random());
    }

    @PostMapping("save")
    public Random save() {
        Random random = new Random();
        random.setNumber(Math.random());
        randomRepository.save(random);
        return random;
    }
}
I've found a workaround by making the save method synchronized. Is this the right way? Has anyone encountered the same problem?
Possible solution
#PostMapping("save")
public synchronized Random save(){
Random random = new Random();
random.setNumber(Math.random());
randomRepository.save(random);
return random;
}
I expected the save operation to complete easily, but instead it hangs until it fails after 30 seconds.
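For reference, the pool size of 10 and the 30-second wait in the error message are HikariCP defaults. They can be tuned through the standard Spring Boot properties; the values below are illustrative only, and raising them masks, rather than fixes, whatever is holding the connections:

spring.datasource.hikari.maximum-pool-size=20
spring.datasource.hikari.connection-timeout=30000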

We faced a very similar issue when working with Spring Boot & Couchbase.
When the number of requests was high, the connections to the DB got stuck.
The solution we used was to move to async method calls at all levels, from the controller down to the DB operations.
See this post & answer
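A minimal sketch of that idea applied to the controller above, assuming it is enough to hand the save off to another thread; the use of CompletableFuture.supplyAsync on the common pool is illustrative, not from the original answer:

import java.util.concurrent.CompletableFuture;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class RandomController {

    private final RandomRepository randomRepository;

    public RandomController(RandomRepository randomRepository) {
        this.randomRepository = randomRepository;
    }

    // Spring MVC completes the response when the future completes, so the
    // request thread is released immediately instead of waiting on the save.
    @PostMapping("save")
    public CompletableFuture<Random> save() {
        return CompletableFuture.supplyAsync(() -> {
            Random random = new Random();
            random.setNumber(Math.random());
            return randomRepository.save(random);
        });
    }
}

Note that this frees the request threads but does not enlarge the connection pool: the number of truly concurrent saves is still bounded by Hikari's 10 connections.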

Related

Cron Scheduler with WebClient

I am working with Spring Boot, trying to send data from one database to the other.
First, I did this by making a GET request to fetch the data from the first database and a POST through WebClient to send it to the other database. It worked!
But when I tried to do the same with a cron scheduler via the @Scheduled annotation, the data was not posted to the database. The function itself runs fine (I verified that by printing from inside it), but the WebClient never posts the data (I also checked the data, and it was fine).
The Cron class is:
@Component
public class NodeCronScheduler {

    @Autowired
    GraphService graphService;

    @Scheduled(cron = "*/10 * * * * *")
    public void createAllNodesFiveSeconds() {
        graphService.saveAlltoGraph("Product");
    }
}
The saveAlltoGraph function takes all the tuples from a Product table and sends a POST request to the graph database API, which creates nodes from those tuples.
Here is the function:
public Mono<Statements> saveAlltoGraph(String label) {
    JpaRepository currentRepository = repositoryService.getRepository(label);
    List<Model> allModels = currentRepository.findAll();
    Statements statements = statementService.createAllNodes(allModels, label);
    //System.out.println(statements);
    return webClientService.sendStatement(statements);
}
First, the label "Product" is used to look up the JpaRepository for that table. Then we fetch all the tuples of that table into a list and create objects from them (a serializer can be used to get the JSON).
Here is the sendStatement function:
public Mono<Statements> sendStatement(Statements statements) {
    System.out.println(statements);
    return webClient.post().uri("http://localhost:7474/db/data/transaction/commit")
            .body(Mono.just(statements), Statements.class).retrieve().bodyToMono(Statements.class);
}
Everything works when we call saveAlltoGraph through a GET request mapping, but not from the scheduler.
I tried adding .block() and .subscribe() to the call, and things started working with the cron scheduler.
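That outcome matches how Reactor works: the Mono returned by WebClient is lazy, and while Spring subscribes to it for you when it is a controller's return value, @Scheduled discards return values, so nothing ever subscribes and no request is sent. A hedged sketch of the scheduled method with an explicit subscription (the log output is illustrative):

@Scheduled(cron = "*/10 * * * * *")
public void createAllNodesFiveSeconds() {
    // Subscribe explicitly: the scheduler ignores the returned Mono, so
    // without this the POST is never executed. block() would also work,
    // but it ties up the scheduler thread until the request completes.
    graphService.saveAlltoGraph("Product")
            .subscribe(
                    statements -> System.out.println("Posted: " + statements),
                    Throwable::printStackTrace);
}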

How to save List and return PagedResources in Spring

I have a List I need to return as PagedResources in a Spring HATEOAS-powered REST API. I have tried this:
List<User> users = someUserGenerationMethod();
PageImpl<User> page = new PageImpl<User>(users);//users size is greater than 1
return parAssembler.toResource(page, userResourceAssembler);
having:
@Autowired
private PagedResourcesAssembler<User> parAssembler;
and userResourceAssembler is an instance of:
public class UserResourceAssembler extends ResourceAssemblerSupport<User, UserResource> {...}
and:
public class UserResource extends ResourceSupport{...}
but it results in java.lang.IllegalArgumentException: Page size must not be less than one!
How could I achieve that?
The problem was how I instantiated PageImpl; I'm not sure why, but using a different constructor:
Page<User> page = new PageImpl<User>(users, new PageRequest(0, DEFAULT_USER_PAGE_SIZE), 1);
solved the problem. Does anybody know why? Bug or bad use?
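A likely explanation: the single-argument PageImpl constructor carries no Pageable, so the page reports a size of 0, and the assembler trips PageRequest's "size must not be less than one" check when it builds the paging metadata and links. Supplying an explicit PageRequest avoids that. A hedged sketch (DEFAULT_USER_PAGE_SIZE is the constant from the snippet above; newer Spring Data versions would use PageRequest.of(0, size) instead of the deprecated constructor):

List<User> users = someUserGenerationMethod();

// Explicit paging metadata: page 0, a positive page size, and the total
// element count. Passing the real total instead of the literal 1 keeps
// totalElements in the response honest.
Page<User> page = new PageImpl<User>(
        users, new PageRequest(0, DEFAULT_USER_PAGE_SIZE), users.size());

return parAssembler.toResource(page, userResourceAssembler);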
There are two ways to achieve it.
Either re-query the database with repo.findAll() to get the list of users (this is useful when some DB processing is involved, such as time-stamping or seed generation for IDs).
Or, if no DB processing is involved, return generatedUsers instead of savedUsers (in that case, make sure repo.saveAll(generatedUsers) executed successfully and without errors).

How can I call a function on start and periodically on Playframework 2.5

I need to make a WS request when Play starts so I can log in to an external service and obtain a token. I need that token for future requests. I know how to make WS requests; what I don't know is where to place the code so it runs on startup. At the moment it sits in a controller function.
In case you want to see some code:
// login data
ObjectNode tvdbaccount = Json.newObject();
tvdbaccount.put("apikey", "*****");
tvdbaccount.put("username", "*****");
tvdbaccount.put("userkey", "*****");

// try to log in
String token = "";
CompletionStage<JsonNode> request = WS.url("https://api.thetvdb.com/login")
        .post(tvdbaccount)
        .thenApply(WSResponse::asJson);
try {
    JsonNode response = request.toCompletableFuture()
            .get(5, TimeUnit.SECONDS);
    token = response.get("token").asText();
} catch (Exception ex) {
    System.out.println(ex.getMessage());
}
That token expires after 24 hours, so I also want to call a function every 12 hours, for example, that refreshes it. That function is similar to the previous one; it's just another WS request.
I'm using Play Framework 2.5, where GlobalSettings is deprecated, and the answers I've found for 2.5 aren't very clear, so I haven't managed to get this working.
Thanks to Alexander B, I've been able to get what I wanted.
Eager Singleton
I resolved the on-start function call with an eager singleton.
What I did is a class for the TVDB things; the important part is to put whatever you want to run on startup inside the constructor of that class, and then to bind it in a module:
bind(TVDB.class).asEagerSingleton();
Akka actors
For the periodic function call I used an Akka actor.
I implemented an actor that sends itself a message every 12 hours, so I placed the scheduling code in the actor's preStart method. I think the Play Framework documentation on scheduling asynchronous tasks is outdated and doesn't work as written (at least for me).
Then I bound the actor in the module:
bindActor(TVDBActor.class, "TVDBActor");
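A hedged sketch of what the Guice module might look like with both bindings together (the module class name is illustrative; TVDB and TVDBActor are the classes from the snippets above):

import com.google.inject.AbstractModule;
import play.libs.akka.AkkaGuiceSupport;

public class Module extends AbstractModule implements AkkaGuiceSupport {

    @Override
    protected void configure() {
        // Instantiated at application start, so the constructor's login
        // request runs immediately.
        bind(TVDB.class).asEagerSingleton();
        // Makes the actor injectable via @Named("TVDBActor").
        bindActor(TVDBActor.class, "TVDBActor");
    }
}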
If someone needs the actor code, here it is:
public class TVDBActor extends UntypedActor {

    @Inject
    public void preStart(final ActorSystem system, @Named("TVDBActor") ActorRef tvdbActor) {
        system.scheduler().schedule(
                Duration.create(12, TimeUnit.HOURS),
                Duration.create(12, TimeUnit.HOURS),
                tvdbActor,
                "tick",
                system.dispatcher(),
                null
        );
    }

    @Override
    public void onReceive(Object msg) throws Exception {
        TVDB.refreshToken();
    }
}

Playframework 1.2.5 and JDBI

I am trying to use JDBI with Play 1.2.5, and I'm having a problem with running out of database connections. I am using the H2 in-memory database (in application.conf, db=mem).
I have created a class for obtaining JDBI instances that uses Play's DB.datasource, like so:
public class Database {

    private static DataSource ds = DB.datasource;

    private static DBI getDatabase() {
        return new DBI(ds);
    }

    public static <T> T withDatabase(HandleCallback<T> hc) {
        return getDatabase().withHandle(hc);
    }

    public static <T> T withTransaction(TransactionCallback<T> tc) {
        return getDatabase().inTransaction(tc);
    }
}
Every time I do a database call, a new DBI instance is created, but it always wraps the same static DataSource object (play.db.DB.datasource).
What's happening is that after a while I get the following:
CallbackFailedException occured : org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: An attempt by a client to checkout a Connection has timed out.
I am confused, because the whole point of DBI.withHandle() and DBI.inTransaction() is to close the connection and free up resources when the callback completes.
I also tried making getDatabase() return the same DBI instance every time, but the same problem occurred.
What am I doing wrong?
Duh. Turns out I was leaking connections in some old code that wasn't using withHandle(). As soon as I upgraded it, the problem stopped.
From the official documentation:
Because Handle holds an open connection, care must be taken to ensure that each handle is closed when you are done with it. Failure to close Handles will eventually overwhelm your database with open connections, or drain your connection pool.
It turns out you were not guaranteeing that the handle gets closed everywhere one is obtained.
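For contrast, a hedged sketch of the two styles in JDBI v2; the query and table name are made up for illustration:

import java.util.List;
import java.util.Map;

import org.skife.jdbi.v2.DBI;
import org.skife.jdbi.v2.Handle;
import org.skife.jdbi.v2.tweak.HandleCallback;

public class HandleExamples {

    // Leaks: if close() is skipped (early return, exception), the pooled
    // connection is never returned and the pool eventually runs dry.
    static List<Map<String, Object>> leaky(DBI dbi) {
        Handle h = dbi.open();
        return h.createQuery("select * from example").list();
        // h.close() is never reached
    }

    // Safe: withHandle() closes the handle for you, even if the callback throws.
    static List<Map<String, Object>> safe(DBI dbi) {
        return dbi.withHandle(new HandleCallback<List<Map<String, Object>>>() {
            public List<Map<String, Object>> withHandle(Handle h) {
                return h.createQuery("select * from example").list();
            }
        });
    }
}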

How do I insert a lot of entities in a Play! Job?

In my application I have to simulate various situations for analysis, which means inserting a (very) large number of rows into a database (we're talking several billion).
Model
@Entity
public class Case extends Model {
    public String url;
}
Job
public class Simulator extends Job {

    public void doJob() {
        // n is the (very large) number of rows to insert; the loop bound was
        // elided in the original post
        for (int i = 0; i != n; i++) {
            // Some stuff
            new Case(someString).save();
        }
    }
}
After half an hour, there is still nothing in the database, yet the debug traces show Play is inserting rows. I suspect some kind of cache.
I've tried just about everything:
Model.em().flush();
Changes nothing.
Model.em().getTransaction().commit();
throws TransactionRequiredException: no transaction is in progress
Model.em().setFlushMode(FlushModeType.COMMIT);
Model.em().setFlushMode(FlushModeType.AUTO);
Changes nothing.
I've also tried @NoTransaction annotations everywhere:
Class & functions in Controller
Class Case
Overriding save method in Model
Class & functions of my Job
Getting quite desperate. Any kind of advice is welcome.
EDIT: After a little research, the first row has appeared in the database. Its ID is around 550,000, which means about half a million rows are sitting somewhere between my application and the database.
Try
em.getTransaction().begin();
em.persist(model);
em.getTransaction().commit();
You can't commit a transaction before you begin it.
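Building on that, bulk inserts with JPA are usually committed in chunks so the persistence context doesn't buffer millions of pending entities. A hedged sketch of that pattern using Play 1.x's JPA helper, assuming the job runs inside an active Play-managed transaction as the other answer notes (n and someString come from the question's snippet; the chunk size of 1,000 is arbitrary):

import javax.persistence.EntityManager;
import play.db.jpa.JPA;
import play.jobs.Job;

public class Simulator extends Job {

    public void doJob() {
        EntityManager em = JPA.em();
        int chunkSize = 1000; // arbitrary; tune against memory and throughput
        for (int i = 0; i != n; i++) {
            new Case(someString).save();
            if (i % chunkSize == 0) {
                em.flush();                    // push pending inserts to the database
                em.clear();                    // drop references, keep the context small
                em.getTransaction().commit();  // make the rows visible to other sessions
                em.getTransaction().begin();   // open the next chunk's transaction
            }
        }
    }
}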
As per the documentation, the job should have its own transaction enabled, as Play requests do, so that's not the issue. Try doing this:
// n as above; the loop bound was elided in the original post
for (int i = 0; i != n; i++) {
    // Some stuff
    Case tmp = new Case(someString);
    tmp = JPA.em().merge(tmp);
    tmp.save();
}
The idea is to attach the newly created object to the EntityManager explicitly before saving, making sure it is among the "dirty" objects that will be persisted.
You need to instruct Play! when to run your job by annotating the class with one of these annotations: @OnApplicationStart, @Every or @On.
Please check the Play! documentation on jobs.
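For instance, a hedged sketch of wiring the Simulator job from the question to one of those annotations (the interval is illustrative):

import play.jobs.Every;
import play.jobs.Job;

// Play! runs this job every 24 hours; @OnApplicationStart would instead run
// it once when the application boots.
@Every("24h")
public class Simulator extends Job {
    public void doJob() {
        // ... insert the Case rows here ...
    }
}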
