I have just a REST controller, a service, and DAO methods. I check whether a student already exists in the database; if it does, I update the existing object with the values from the current request and store it. But somehow, in a multi-threaded environment, the values get overwritten for a few of the records. My understanding is that each thread has its own stack, and variables that live on the stack are specific to that thread. Given that, I don't understand how the student object sometimes gets overridden. Does anyone have any idea? Please provide some input.
@RestController
public class HelloWorldController {

    @Autowired
    private StudentService service;

    @RequestMapping(value = "/student", method = RequestMethod.PUT)
    public String updateStudent(StudentDto studentDto) {
        service.updateStudent(studentDto);
        return "Success";
    }
}
...
@Service
public class StudentService {

    @Autowired
    private StudentDao studentDao;

    public void updateStudent(StudentDto dto) {
        Student student = studentDao.findByStudentId(dto.getId());
        if (student != null) {
            student.setName(dto.getName());
            studentDao.update(student); // DAO extends CouchbaseRepository
        }
    }
}
I think you can use:
studentDao.save(student);
instead of
studentDao.update(student);
And if you want to handle the task in a parallel thread, you can use Spring's @Async annotation.
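The overwriting is most likely a lost update at the database level rather than anything to do with thread stacks: two requests read the same student, each modifies its own copy, and the second write silently replaces the first. One hedged way to detect this with Spring Data Couchbase is optimistic locking via a @Version field backed by the document's CAS value. A minimal sketch, assuming field names like id and name (your actual model may differ):

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.couchbase.core.mapping.Document;

@Document
public class Student {

    @Id
    private String id;

    private String name;

    // Backed by the Couchbase CAS value: save() throws an
    // OptimisticLockingFailureException if another thread has
    // updated the document since this copy was loaded.
    @Version
    private long version;

    // getters and setters omitted
}

With that in place, the service's read-modify-write can catch the exception and retry, instead of silently overwriting another request's changes.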
I have the following problem. When a controller invokes a service method annotated with @Transactional, and this method invokes a @Transactional(readOnly = true) method, the changes to the object don't get saved to the database. I don't know why. I thought that the first method begins a read-write transaction and the second method uses the same transaction.
Example:
/** Controller class **/
@RestController
@RequestMapping("/path1")
public class MyObjectRestController {
    ...
    @Autowired
    MyObjectService myObjectService;

    @PostMapping(value = "/path2")
    @ResponseStatus(HttpStatus.CREATED)
    public void save(@RequestBody MyObject myObject) {
        myObjectService.save(myObject);
    }
    ...
}
/** Service class **/
public interface MyObjectService {
    public MyObject save(MyObject myObject);
    public MyObject findOne(long id);
}
/** Service implementation class **/
@Service
public class MyObjectServiceImpl implements MyObjectService {

    /**
     * Save Object
     */
    @Override
    @Transactional
    public MyObject save(MyObject newObject) {
        MyObject myObject = this.findOne(newObject.getId());
        // Modify Object
        myObject.setAttribute1(...);
        myObject.setAttribute2(...);
        // Save Object (using Repository)
        return MyObjectDao.save(myObject);
    }

    /**
     * Get Object by Id
     */
    @Override
    @Transactional(readOnly = true)
    public MyObject findOne(long id) {
        Optional<MyObject> myObject = MyObjectDao.findById(id);
        return myObject.isPresent() ? myObject.get() : null;
    }
}
This example doesn't update the object's attributes. But if I remove the @Transactional annotation from the save(..) method, the example works successfully (the object is modified).
Why is this happening?
I see that the save method is annotated with @Transactional and calls the findOne method, which is also @Transactional. So in this case the same transaction is propagated to the findOne method, and the read-only behaviour is applied to the current transaction. It therefore does not flush, and you see that the entity is not modified.
So in this case you should use a separate transaction for reads and writes.
The findOne method can be annotated as below:
@Transactional(propagation = Propagation.REQUIRES_NEW, readOnly = true)
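Applied to the service above, that would look roughly like this. It is a sketch only: it assumes the call actually goes through the Spring proxy so the annotation is honoured, and Propagation comes from org.springframework.transaction.annotation.

// Inside MyObjectServiceImpl: the read now runs in its own short-lived
// read-only transaction, suspended from the read-write transaction
// opened by save(..).
@Override
@Transactional(propagation = Propagation.REQUIRES_NEW, readOnly = true)
public MyObject findOne(long id) {
    return MyObjectDao.findById(id).orElse(null);
}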
I'm using Spring REST. I have an entity called Operator that looks like this:
@Entity
@Table(name = "operators")
public class Operator {

    //various properties

    private List<OperatorRole> operatorRoles;

    //various getters and setters

    @LazyCollection(LazyCollectionOption.TRUE)
    @OneToMany(mappedBy = "operator", cascade = CascadeType.ALL)
    public List<OperatorRole> getOperatorRoles() {
        return operatorRoles;
    }

    public void setOperatorRoles(List<OperatorRole> operatorRoles) {
        this.operatorRoles = operatorRoles;
    }
}
I also have a corresponding OperatorRepository that extends JpaRepository.
I defined a controller that exposes this API:
@RestController
@RequestMapping("/api/operators")
public class OperatorController {

    private final OperatorRepository operatorRepository;

    @Autowired
    public OperatorController(OperatorRepository operatorRepository) {
        this.operatorRepository = operatorRepository;
    }

    @GetMapping(value = "/myApi")
    @Transactional(readOnly = true)
    public MyResponseBody myApi(@ApiIgnore @AuthorizedConsumer Operator operator) {
        if (operator.getOperatorRoles() != null) {
            for (OperatorRole current : operator.getOperatorRoles()) {
                //do things
            }
        }
    }
}
This used to work before I made the OperatorRoles list lazy; now if I try to iterate through the list it throws LazyInitializationException.
The Operator parameter is fetched from the DB by a filter that extends Spring's BasicAuthenticationFilter, and is then somehow autowired into the API call.
I can get other, non-lazy-initialized properties without a problem. If I do something like operator = operatorRepository.getOne(operator.getId());, everything works, but I would need to change this in too many places in the code.
From what I understand, the problem is that the session used to fetch the Operator in the BasicAuthenticationFilter is no longer open by the time I reach the actual API method in OperatorController.
I managed to wrap everything in an OpenSessionInViewFilter, but it still doesn't work.
Does anyone have any ideas?
I was having this very same problem for a long time and was using FetchType.EAGER, but today something clicked in my head ...
@Transactional didn't work, so I thought: "if declarative transactions don't work, maybe programmatic ones will?" And they do!
Based on the Spring programmatic transaction management docs:
public class JwtAuthorizationFilter extends BasicAuthenticationFilter {

    private final TransactionTemplate transactionTemplate;

    public JwtAuthorizationFilter(AuthenticationManager authenticationManager,
                                  PlatformTransactionManager transactionManager) {
        super(authenticationManager);
        this.transactionTemplate = new TransactionTemplate(transactionManager);
        // Set your desired propagation behavior, isolation level, readOnly, etc.
        this.transactionTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
    }

    private void doSomething() {
        transactionTemplate.execute(transactionStatus -> {
            // execute your queries
            return null;
        });
    }
}
It could be late for you, but I hope it helps others.
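As a usage sketch for the original question (operatorRepository, findByUsername, and username are assumptions, not part of the posted code), the lazy collection can be initialized inside the template, while its transaction, and therefore the Hibernate session, is still open:

private Operator loadOperatorWithRoles(String username) {
    return transactionTemplate.execute(status -> {
        // Hypothetical finder; substitute however the filter currently loads the operator.
        Operator operator = operatorRepository.findByUsername(username);
        // Touch the lazy collection while the session is open so it is
        // initialized before the entity reaches the controller.
        operator.getOperatorRoles().size();
        return operator;
    });
}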
I have a server built with Java and Spring.
What I am trying to do is have my controller receive two different objects at the same endpoint.
This is an example of what I mean:
I know I can do this:
public class Option1 {
    private String name;
    ...
    //getter and setter
}

public class Option2 {
    private Long id;
    ...
    //getter and setter
}

@Controller
public class Controller {

    @RequestMapping(value = "service/getData/option1", method = RequestMethod.POST)
    @ResponseBody
    public String searchProv(@RequestBody Option1 data1) {
        return "option1";
    }

    @RequestMapping(value = "service/getData/option2", method = RequestMethod.POST)
    @ResponseBody
    public String searchProv(@RequestBody Option2 data2) {
        return "option2";
    }
}
but I wonder if it is possible to pass different JSON objects to the same endpoint and do this:
@Controller
public class Controller {

    @RequestMapping(value = "service/getData", method = RequestMethod.POST)
    @ResponseBody
    public ResponseEntity<Any> getData(@RequestBody Option1And2 data) {
        if (data instanceof Option1) {
            return new ResponseEntity<Any>(data.name, HttpStatus.OK);
        }
        if (data instanceof Option2) {
            return new ResponseEntity<Any>(data.id, HttpStatus.OK);
        }
        return new ResponseEntity<Any>("ok", HttpStatus.OK);
    }
}
such that 'Option1And2' is a generic object that can be either Option1 or Option2.
I tried replacing 'Option1And2' with 'Any', but it didn't go well, because I just get a list of keys and values.
You should use a JsonNode object.
For your example you should do this:
@Controller
public class Controller {

    @RequestMapping(value = "service/getData", method = RequestMethod.POST)
    @ResponseBody
    public ResponseEntity<Any> getData(@RequestBody JsonNode jsonNode) {
        ObjectMapper obj = new ObjectMapper();
        if (jsonNode.has("name")) {
            Option1 result = obj.convertValue(jsonNode, Option1.class);
            return new ResponseEntity<Any>(result.name, HttpStatus.OK);
        } else {
            Option2 result = obj.convertValue(jsonNode, Option2.class);
            return new ResponseEntity<Any>(result.id, HttpStatus.OK);
        }
    }
}
You should import JsonNode and ObjectMapper from here:
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.JsonNode;
This link should help you understand JsonNode better and give you more details, and this link should help you with convertValue from JsonNode to a Java object (POJO).
This is a good time to use inheritance and Java generics. It is worth noting that if your controller has any dependencies, such as a @Service or a @Repository, then those too must be generic.
You might have a generic controller:
abstract class GenericController<T> {

    public abstract GenericService<T> getService();

    @GetMapping
    public ResponseEntity<Iterable<T>> findAll() {
        return ResponseEntity.ok(getService().findAll());
    }

    @PostMapping
    public ResponseEntity<T> save(T entity) {
        return ResponseEntity.ok(getService().save(entity));
    }

    // @DeleteMapping, @PutMapping
    // These mappings will automatically be inherited by
    // the child class. So in the case of findAll(), the API
    // will have a GET mapping on /category as well as a GET
    // mapping on /product. So, by defining and annotating the
    // CRUD operations in the parent class, they will automatically
    // become available in all child classes.
}
@Controller
@RequestMapping("/category")
class CategoryContr extends GenericController<Category> {

    @Autowired CategoryServ serv;

    @Override
    public GenericService<Category> getService() {
        return serv;
    }
}

@Controller
@RequestMapping("/product")
class ProductContr extends GenericController<Product> {

    @Autowired ProductServ serv;

    @Override
    public GenericService<Product> getService() {
        return serv;
    }
}
You then have to have abstract versions of the dependencies. The services:
abstract class GenericService<T> {

    public abstract GenericRepository<T> getRepository();

    public Iterable<T> findAll() {
        return getRepository().findAll();
    }

    public T save(T entity) {
        return getRepository().save(entity);
    }
}

@Service
class CategoryServ extends GenericService<Category> {

    @Autowired CategoryRepo repo;

    @Override
    public GenericRepository<Category> getRepository() {
        return repo;
    }
}

@Service
class ProductServ extends GenericService<Product> {

    @Autowired ProductRepo repo;

    @Override
    public GenericRepository<Product> getRepository() {
        return repo;
    }
}
Then, the services have their dependencies as well - the repositories:
@NoRepositoryBean
interface GenericRepository<T> extends JpaRepository<T, Long> {
}

@Repository
interface CategoryRepo extends GenericRepository<Category> {
}

@Repository
interface ProductRepo extends GenericRepository<Product> {
}
This was my first approach. It works very nicely. However, it does create strong coupling between the business logic of each service and the generic service. The same holds true for the generic controller and its child classes. You can of course always override a particular CRUD operation, but you must do this with care, as you may create unexpected behavior.
It is also worth noting that inheriting from classes that have methods annotated with @RequestMapping automatically exposes all of the annotated methods. This may be undesirable. For example, we may not want a delete option for categories, but we do want it for products. To combat this, instead of annotating the method in the parent class, we can simply define it there and override the desired CRUD operations in the child class with the added @RequestMapping annotation, calling the super-class method.
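A minimal sketch of that selective exposure, assuming GenericService also defines a deleteById(..) method (not shown above): the parent declares the operation without a mapping, and only the product controller exposes it over HTTP.

abstract class GenericController<T> {
    public abstract GenericService<T> getService();

    // Not annotated, so it is not exposed by default.
    public ResponseEntity<Void> deleteById(Long id) {
        getService().deleteById(id);   // assumed to exist on GenericService
        return ResponseEntity.noContent().build();
    }
}

@Controller
@RequestMapping("/product")
class ProductContr extends GenericController<Product> {
    // ... getService() as before

    @Override
    @DeleteMapping("/{id}")
    public ResponseEntity<Void> deleteById(@PathVariable Long id) {
        return super.deleteById(id);
    }
}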
Another approach is to use annotations.
It seems like you want the program itself to determine which type the option is. But before you do that, are you sure what the difference between these two objects is?
First: what is Option1And2, actually? If Option1And2 contains all the fields of Option1 and Option2, but is not a subclass of either, then Option1And2 could look like this:
@Data
public class Option1And2 {
    private String name;
    private Long id;
}
If you have another constraint, like "one of them, and only one of them, has to be null", then you can determine the type by that rule (sketched below).
If you don't have any other constraints, then maybe you could add a new field as a flag.
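A sketch of that "decide by which field is present" rule, reusing the names from the question (the bad-request fallback is my addition):

@RequestMapping(value = "service/getData", method = RequestMethod.POST)
@ResponseBody
public ResponseEntity<Object> getData(@RequestBody Option1And2 data) {
    if (data.getName() != null) {
        return new ResponseEntity<Object>(data.getName(), HttpStatus.OK); // treat as Option1
    }
    if (data.getId() != null) {
        return new ResponseEntity<Object>(data.getId(), HttpStatus.OK);   // treat as Option2
    }
    return ResponseEntity.badRequest().build(); // neither field was sent
}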
In fact, this code style is not recommended. If those two functions have different responsibilities, it's probably better not to mix them together; you will understand what I mean when you have to refactor this code.
If these two functions do have a lot in common, it may be better to refactor the service logic instead of just combining the two services roughly by introducing a new Option1And2 parameter.
By the way, what exactly are you trying to do? Why do you want to merge those two objects into one?
While working on a project that involves requesting multiple data types from a database, I came to the following question:
Let's say I have two Java classes that correspond to database entities:
Routes
public class Route {
    public Route(int n, int region, Date fdate, boolean changed, int points,
            int length) {
        super();
        this.n = n;
        this.region = region;
        this.fdate = fdate;
        this.changed = changed;
        this.points = points;
        this.length = length;
    }
}
Carrier
public class Carrier {
    public Carrier(...) {
        this.id = src.getId();
        this.name = src.getName();
        this.instId = src.getInstId();
        this.depotId = src.getDepotId();
    }
}
If so, what's the correct approach to creating DAO interfaces and classes? I'm doing it like this:
@Repository
public class CarrierDaoImpl implements CarrierDao {

    @Autowired
    DataSource dataSource;

    public List<Carrier> getAllOrgs() { ... }
}

@Repository
public class RoutesDaoImpl implements RoutesDao {

    @Autowired
    DataSource dataSource;

    public ArrayList<AtmRouteItem> getRoutes(AtmRouteFilter filter) { ... }
}
I'm creating a @Repository DAO for every Java class/DB entity, and then two separate controllers for requests about carriers and routes, like this:
@RestController
@RequestMapping(path = "/routes")
public class RoutesController {

    @Autowired
    RoutesDao routesDao;

    @GetMapping(value = {"/getRoutes/", "/getRoutes"})
    public ArrayList<Route> getRoutes() { ... }
}
And the same for the Carriers controller. Is this correct, and if not, what is the correct approach?
Sorry for the styling issues; this is my first question on Stack Overflow :)
I would suggest creating services marked with the @Service annotation (i.e. a CarrierService interface and a CarrierServiceImpl implementation), and then injecting them into the controllers. Use the repositories within the services, because some database operations will require transactions, and the better place to manage transactions is the service layer. Services can also do more specialized work that needs access to multiple repositories, so you can inject several of them. And don't forget to mark your services with the @Transactional annotation.
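A minimal sketch of that layering, reusing the names from the question (the service method name is illustrative):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

public interface CarrierService {
    List<Carrier> getAllCarriers();
}

@Service
@Transactional
public class CarrierServiceImpl implements CarrierService {

    @Autowired
    private CarrierDao carrierDao;

    @Override
    public List<Carrier> getAllCarriers() {
        return carrierDao.getAllOrgs();
    }
}

The controller then depends on CarrierService rather than on CarrierDao directly.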
It's correct to have a DAO for each entity.
When working with JPA repositories you have no choice but to provide the entity. For instance:
public interface FooRepository extends JpaRepository<Foo,Long>{}
The same goes for the REST controllers: you group functionality by object, as you are doing.
You can improve your mapping to be more RESTful. To retrieve all routes, don't specify a path:
@GetMapping
public ArrayList<RouteResource> getRoutes() { ... }
(I haven't used @GetMapping yet, but it should work like that.)
And if you want a specific route:
@GetMapping("/get/{id}")
public RouteResource getRoute(@PathVariable long id) { ... }
You should return resources instead of entities to client.
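For example, a hypothetical RouteResource could expose only what the client needs (the getters on Route are assumed; the posted class only shows its constructor):

public class RouteResource {

    private final int n;
    private final int region;
    private final int length;

    public RouteResource(Route route) {
        // Assumes conventional getters on Route, which are not shown in the question.
        this.n = route.getN();
        this.region = route.getRegion();
        this.length = route.getLength();
    }

    // getters omitted
}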
I'm not sure where to open my Transaction object. Inside the service layer? Or the controller layer?
My Controller basically has two services, let's call them AService and BService. Then my code goes something like:
public class Controller {

    public AService aService = new AService();
    public BService bService = new BService();

    public void doSomething(SomeData data) {
        //Transaction transaction = HibernateUtil.getSession().openTransaction();
        if (data.getSomeCondition()) {
            aService.save(data.getSomeVar1());
            bService.save(data.getSomeVar2());
        } else {
            bService.save(data.getSomeVar2());
        }
        //transaction.commit(); or optional try-catch with rollback
    }
}
The behavior I want is that if bService#save fails, I can invoke transaction#rollback so that whatever was saved by aService is rolled back as well. This only seems possible if I create one single transaction for both saves.
But looking at it from a different perspective, it looks really ugly that my Controller depends on the Transaction. It would be better if I created the Transaction inside the respective services (something like how Spring's @Transactional works), but if I do it that way, I don't know how to achieve what I want.
EDIT: Fixed the code and added another condition. I am not using any Spring dependencies, so the usage of @Transactional is out of the question.
You can accomplish what you're asking with another layer of abstraction and using composition.
public class CompositeABService {

    @Autowired
    private AService aservice;

    @Autowired
    private BService bservice;

    @Transactional
    public void save(Object value1, Object value2) {
        aservice.save(value1);
        bservice.save(value2);
    }
}

public class AService {

    @Transactional
    public void save(Object value) {
        // joins an existing transaction if one exists, creates a new one otherwise.
    }
}

public class BService {

    @Transactional
    public void save(Object value) {
        // joins an existing transaction if one exists, creates a new one otherwise.
    }
}
This same pattern is typically used when you need to interact with multiple repositories as part of a single unit of work (e.g. a transaction).
Now all your controller needs to depend on is CompositeABService, or whatever you wish to name it.
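The controller from the question could then look roughly like this (a sketch; BService is kept around for the single-save branch):

public class Controller {

    @Autowired
    private CompositeABService compositeABService;

    @Autowired
    private BService bService;

    public void doSomething(SomeData data) {
        if (data.getSomeCondition()) {
            // Both saves run in one transaction; if bService.save(..) fails,
            // aService.save(..) is rolled back with it.
            compositeABService.save(data.getSomeVar1(), data.getSomeVar2());
        } else {
            bService.save(data.getSomeVar2());
        }
    }
}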