I created the following entity class where the primary key is calculated by a Table Generator.
@SuppressWarnings("serial")
@Entity
public class Article implements Serializable {
@Id
@TableGenerator(name = "ARTICLE_TABLE_GEN", table = "sequences", pkColumnName = "seq_name", valueColumnName = "seq_count", pkColumnValue = "ART_SEQ")
@GeneratedValue(strategy = GenerationType.TABLE, generator = "ARTICLE_TABLE_GEN")
private long id;
In the debug log, I read that the generation worked.
[DEBUG] Generated identifier: 200, using strategy: org.hibernate.id.MultipleHiLoPerTableGenerator
The entities are managed by a JpaRepository.
@Repository
public interface IArticleRepository extends JpaRepository<Article, Long> {
List<Article> findByShortTextLike(String shortText);
}
In Java code, I access this repository via a Service.
@Service
public class ArticleService implements IArticleService {
@Autowired
private IArticleRepository articleRepository;
@Override
@Transactional(readOnly = true)
public Article getArticleByID(long id) {
return this.articleRepository.findOne(id);
}
@Override
@Transactional
public Article createArticle(String shortText, String longText,
String packageUnit, double weight, String group, char abcClass) {
if (getArticleByShortText(shortText).size() == 0) {
Article article = new Article();
article.setShortText(shortText);
article.setDescription(longText);
article.setPackageUnit(packageUnit);
article.setWeight(weight);
article.setMaterialGroup(group);
article.setClassABC(abcClass);
this.articleRepository.saveAndFlush(article);
return article;
} else
return null;
}
@Override
@Transactional(readOnly = true)
public List<Article> getArticleByShortText(String short_text) {
return this.articleRepository.findByShortTextLike(short_text);
}
}
After the service method "createArticle" calls the repository to save and flush the new instance, the row is written to the database correctly. However, the generated ID is not written back to the object the method returns.
I remember that the ID was set on the object when I used AUTO_INCREMENT on the database column. Why does this not happen in the new case?
I have no experience with Spring Data JPA, but the Javadoc of saveAndFlush() shows that it returns an entity, so it probably uses EntityManager.merge() internally. Try changing your code to
article = this.articleRepository.saveAndFlush(article);
return article;
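For context, here is a minimal sketch of the whole service method after that change (names taken from the question; saveAndFlush() may return a different, merged instance than the one passed in, which is why the returned reference should be kept):

@Override
@Transactional
public Article createArticle(String shortText, String longText,
        String packageUnit, double weight, String group, char abcClass) {
    if (getArticleByShortText(shortText).size() == 0) {
        Article article = new Article();
        article.setShortText(shortText);
        article.setDescription(longText);
        article.setPackageUnit(packageUnit);
        article.setWeight(weight);
        article.setMaterialGroup(group);
        article.setClassABC(abcClass);
        // keep the returned reference: it carries the generated ID
        article = this.articleRepository.saveAndFlush(article);
        return article;
    }
    return null;
}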
Related
I'm trying to implement a reactive (R2DBC) repository in Micronaut, but I'm having some problems with the data that is being queried. Those issues don't occur when using non-reactive repositories.
Here's how my reactive repository looks:
@R2dbcRepository(dialect = Dialect.MYSQL)
public interface ReactiveCampaignRepository extends ReactiveStreamsCrudRepository<Campaign, Integer> {
@Override
Flux<Campaign> findAll();
}
And this is how my regular repository looks:
@Repository
public interface CampaignRepository extends CrudRepository<Campaign, Integer> {
}
When invoking the findAll method from ReactiveCampaignRepository I'm able to query all entities, however all of them have null IDs.
When I invoke findAll from CampaignRepository all entities are queried and IDs are populated correctly.
This is how the id field looks in Campaign, which is a remote dependency:
@Entity
@Table(name = "campaign")
public class Campaign implements Serializable {
private static final long serialVersionUID = 1L;
private Integer id;
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "id")
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
This entity is introspected like this:
@Introspected(classes = {Campaign.class})
public class EntitiesConfiguration {
}
I'm new to Micronaut and R2DBC, so I could be missing something obvious, but I cannot figure it out and any pointers would be greatly appreciated.
Thank You
EDIT:
@tmarouane It's just a simple controller to test if things are working as they should.
@Get(value = "/all")
public Flux<CampaignDTO> allCampaigns() {
return reactiveCampaignRepository.findAll().map(CampaignDTO::new);
}
@Get(value = "/all2")
public List<CampaignDTO> allCampaigns2() {
return StreamSupport.stream(campaignRepository.findAll().spliterator(), false).map(CampaignDTO::new).collect(Collectors.toList());
}
and the controller:
@Produces(MediaType.APPLICATION_JSON)
@Secured(SecurityRule.IS_AUTHENTICATED)
@Controller("/campaign")
public class CampaignController {
private final CampaignRepository campaignRepository;
private final ReactiveCampaignRepository reactiveCampaignRepository;
public CampaignController(CampaignRepository campaignRepository,
ReactiveCampaignRepository reactiveCampaignRepository
) {
this.campaignRepository = campaignRepository;
this.reactiveCampaignRepository = reactiveCampaignRepository;
}
CampaignDTO is just a simple DTO class where only a subset of Campaign's fields is used, with a simple constructor taking a Campaign object.
public CampaignDTO(Campaign campaign) {
this.id = campaign.getId();
}
Besides id, there is one more attribute that is not null but whose own attributes are null, which I hadn't spotted at first: customer. The customer_id column is populated in objects queried with both the reactive and non-reactive repos. Here is how it looks in Campaign:
@JoinColumn(name = "customer_id", referencedColumnName = "customer_id")
@ManyToOne(optional = false)
public Customer getCustomer() {
return customer;
}
public void setCustomer(Customer customer) {
this.customer = customer;
}
This seems to be solved in Micronaut 3.0.1 but it doesn't work in 3.0.2.
In our Spring Boot application, I am trying to save an aggregate that consists of a root entity (ParentEntity) and a Set of child entities (ChildEntity).
The intention is that all operations are done through the aggregate, so there is no need for a repository for ChildEntity, as the ParentEntity is supposed to manage all save or update operations.
This is what the entities look like:
@Entity
@Table(name = "tab_parent", schema = "test")
public class ParentEntity implements Serializable {
@Id
@Column(name = "parent_id")
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Integer parentId;
@Column(name = "description")
private String description;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
private OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
private OffsetDateTime modified;
@OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL, orphanRemoval = true, mappedBy = "parent")
private Set<ChildEntity> children;
// constructor and other getters and setters
public void setChildren(final Set<ChildEntity> children) {
this.children = new HashSet<>(children.size());
for (final ChildEntity child : children) {
this.addChild(child);
}
}
public ParentEntity addChild(final ChildEntity child) {
this.children.add(child);
child.setParent(this);
return this;
}
public ParentEntity removeChild(final ChildEntity child) {
this.children.remove(child);
child.setParent(null);
return this;
}
}
@Entity
@DynamicUpdate
@Table(name = "tab_child", schema = "test")
public class ChildEntity implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "child_id")
private Integer childId;
@Column(name = "language_id")
private String languageId;
@Column(name = "text")
private String text;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
public OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
public OffsetDateTime modified;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "parent_id", updatable = false)
private ParentEntity parent;
// constructor and other getters and setters
public ParentEntity getParent() {
return this.parent;
}
public void setParent(final ParentEntity parent) {
this.parent = parent;
}
}
This is the store method to save or update the entities:
public Integer merge(final ParentDomainObject parentDomainObject) {
final ParentEntity parentEntity =
this.mapper.toParentEntity(parentDomainObject);
final ParentEntity result = this.entityManager.merge(parentEntity);
this.entityManager.flush();
return result.getParentId();
}
And this is the store method to retrieve the aggregate by id:
public Optional<ParentDomainObject> findById(final Integer id) {
return this.repo.findById(id).map(this.mapper::toParentDomainObject);
}
As you can see, our architecture strictly separates the store from the service layer, so the service only knows about domain objects and does not depend on Hibernate entities at all.
When updating either the child or the parent, firstly the parent is loaded. In the service layer, the domain object is updated (fields are set, or a child is added/removed).
Then the merge method (see code snippet) of the store is called with the updated domain object.
This works, but not completely as we want. Currently every update leads to the parent and EVERY child entity being saved, even if all fields remained the same. We added the @DynamicUpdate annotation. Now we see that the "modified" field is the problem.
We use a @ColumnTransformer to have the database set the date. Even if you call the service's update method without changing anything, Hibernate generates an update query for EVERY object, which updates only the modified field.
The worst thing is that, since every object is saved, every modified date is also set to the current date. But we need information about exactly which object really changed and when.
Is there any way to tell Hibernate that this column should not be taken into account when deciding what to update? Of course, if a field did change, the update should still set the modified field.
UPDATE:
My second approach after @Christian Beikov mentioned the use of @org.hibernate.annotations.Generated( GenerationTime.ALWAYS )
is the following:
Instead of @Generated (which uses @ValueGenerationType( generatedBy = GeneratedValueGeneration.class )),
I created my own annotations, which use custom AnnotationValueGeneration implementations:
@ValueGenerationType(generatedBy = CreatedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbCreatedTimestamp {
}
public class CreatedTimestampGeneration
implements AnnotationValueGeneration<InDbCreatedTimestamp> {
@Override
public void initialize(final InDbCreatedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.INSERT;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
@ValueGenerationType(generatedBy = ModifiedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbModifiedTimestamp {
}
public class ModifiedTimestampGeneration
implements AnnotationValueGeneration<InDbModifiedTimestamp> {
@Override
public void initialize(final InDbModifiedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.ALWAYS;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
I use these annotations in my entities instead of the @ColumnTransformer annotations now.
This works flawlessly when I insert a new ChildEntity via addChild(), as now not all timestamps of all entities of the aggregate are updated anymore. Only the timestamps of the new child are set now.
In other words, the InDbCreatedTimestamp works as it should.
Sadly, the InDbModifiedTimestamp does not. Because of GenerationTiming.ALWAYS, I expected the timestamp to be generated at the DB level every time an INSERT or UPDATE is issued. If I change a field of a ChildEntity and then save the aggregate, an update statement is generated only for this one database row, as expected. However, the last_modified_datetime column is not updated, which is surprising.
It seems that this is unfortunately still an open bug. This issue describes my problem precisely: Link
Can someone provide a solution for getting this DB function executed on update as well (without using DB triggers)?
You could try to use @org.hibernate.annotations.Generated( GenerationTime.ALWAYS ) on these fields and use a database trigger or default expression to create the value. This way, Hibernate will never write the field, but read it after insert/update.
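For illustration, the mapping on the modified field might look roughly like this (a sketch only; it assumes the database actually populates last_modified_datetime via a column default or trigger):

// Sketch: the column value is produced by the database, Hibernate only reads it back after insert/update
@org.hibernate.annotations.Generated(org.hibernate.annotations.GenerationTime.ALWAYS)
@Column(name = "last_modified_datetime", insertable = false, updatable = false)
private OffsetDateTime modified;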
Overall this has a few downsides though (need the trigger, need a select after insert/update), so I think this is a perfect use case for Blaze-Persistence Entity Views.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure(domain model) the way you like and map attributes(getters) via JPQL expressions to the entity model.
A DTO/domain model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(ParentEntity.class)
@UpdatableEntityView
public interface ParentDomainObject {
@IdMapping
Integer getParentId();
OffsetDateTime getModified();
void setModified(OffsetDateTime modified);
String getDescription();
void setDescription(String description);
Set<ChildDomainObject> getChildren();
@PreUpdate
default void preUpdate() {
setModified(OffsetDateTime.now());
}
@EntityView(ChildEntity.class)
@UpdatableEntityView
interface ChildDomainObject {
@IdMapping
Integer getChildId();
String getName();
}
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ParentDomainObject a = entityViewManager.find(entityManager, ParentDomainObject.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<ParentDomainObject> findAll(Pageable pageable);
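For context, that method would hypothetically live in a Spring Data repository declared against the entity type while returning the entity view (a sketch assuming the Blaze-Persistence Spring Data integration is configured; the repository name is made up):

// Sketch: entity-view return types on a Spring Data repository for ParentEntity
public interface ParentViewRepository extends Repository<ParentEntity, Integer> {
    Page<ParentDomainObject> findAll(Pageable pageable);
}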
The best part is, it will only fetch the state that is actually necessary! It also supports writing/mapping back to the persistence model in an efficient manner. Since it does dirty tracking for you, it will only flush changes if the object is actually dirty.
public Integer merge(final ParentDomainObject parentDomainObject) {
this.entityViewManager.save(this.entityManager, parentDomainObject);
this.entityManager.flush();
return parentDomainObject.getParentId();
}
Hibernate 4.3.11
I have an issue saving the following object graph in Hibernate. The Employer is being saved using the merge() method.
Employer
|_ List<EmployerProducts> employerProductsList;
|_ List<EmployerProductsPlan> employerProductsPlan;
The Employer and EmployerProducts have an auto-generated PK. The EmployerProductsPlan has a composite key consisting of the EmployerProducts id and a String with the plan code.
The error occurs when there is a transient object in the EmployerProducts list that cascades to List<EmployerProductsPlan>. The first error I encountered, which I have been trying to get past, was an internal Hibernate NPE. This post describes precisely the issue that causes the null pointer: Hibernate NullPointer on INSERTED id when persisting three levels using @Embeddable and cascade
The OP left a comment specifying what they did to resolve it, but I end up with a different error when changing to the suggested mapping. After changing the mapping, I am now getting
org.hibernate.NonUniqueObjectException: A different object with the same identifier value was already associated with the session : [com.webexchange.model.EmployerProductsPlan#com.webexchange.model.EmployerProductsPlanId#c733f9bd]
Due to other library dependencies, I cannot upgrade above 4.3.x at this time. This project is using spring-boot-starter-data-jpa 1.3.3. No other work is being performed on the session other than calling merge() and passing the employer object.
Below are the mappings for each class:
Employer
@Entity
@Table(name = "employer")
@lombok.Getter
@lombok.Setter
@lombok.EqualsAndHashCode(of = {"employerNo"})
public class Employer implements java.io.Serializable {
@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "EMPLOYER_NO", unique = true, nullable = false)
private Long employerNo;
.....
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "employer", orphanRemoval = true)
private List<EmployerProducts> employerProductsList = new ArrayList<>(0);
}
EmployerProducts
@Entity
@Table(name = "employer_products")
@Accessors(chain = true) // has to come before @Getter and @Setter
@lombok.Getter
@lombok.Setter
@lombok.EqualsAndHashCode(of = {"employerProductsNo"})
public class EmployerProducts implements Serializable {
@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "employer_products_no", unique = true, nullable = false)
private Long employerProductsNo;
@ManyToOne
@JoinColumn(name = "employer_no", nullable = false)
private Employer employer;
......
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "employerProducts", orphanRemoval = true)
private List<EmployerProductsPlan> employerProductsPlanList = new ArrayList<>(0);
}
EmployerProductsPlan
@Accessors(chain = true) // has to come before @Getter and @Setter
@lombok.Getter
@lombok.Setter
@lombok.EqualsAndHashCode(of = {"id"})
@Entity
@Table(name="employer_products_plan")
public class EmployerProductsPlan implements Serializable {
@EmbeddedId
@AttributeOverrides({ @AttributeOverride(name = "plan", column = @Column(name = "epp_plan", nullable = false)),
@AttributeOverride(name = "employerProductsNo", column = @Column(name = "employer_products_no", nullable = false)) })
private EmployerProductsPlanId id;
@ManyToOne
@JoinColumn(name = "employer_products_no")
@MapsId("employerProductsNo")
private EmployerProducts employerProducts;
}
I am populating the employerProducts field above with the same instance of the EmployerProducts object that is being saved. It is transient and has no id populated, as it does not exist in the DB yet.
EmployerProductsPlanId
@Accessors(chain = true) // has to come before @Getter and @Setter
@lombok.Getter
@lombok.Setter
@lombok.EqualsAndHashCode(of = {"plan", "employerProductsNo"})
@Embeddable
public class EmployerProductsPlanId implements Serializable {
private String plan;
private Long employerProductsNo;
// This was my previous mapping that was causing the internal NPE in Hibernate
/* @ManyToOne
@JoinColumn(name = "employer_products_no")
private EmployerProducts employerProducts;*/
}
UPDATE:
Showing the Struts controller and DAO. The Employer object is never loaded from the DB prior to the save; Struts creates this entire object graph from the HTTP request parameters.
Struts 2.5 controller
@lombok.Getter
@lombok.Setter
public class EditEmployers extends ActionHelper implements Preparable {
@Autowired
@lombok.Getter(AccessLevel.NONE)
@lombok.Setter(AccessLevel.NONE)
private IEmployerDao employerDao;
private Employer entity;
....
public String save() {
beforeSave();
boolean newRecord = getEntity().getEmployerNo() == null || getEntity().getEmployerNo() == 0;
Employer savedEmployer = newRecord ?
employerDao.create(getEntity()) :
employerDao.update(getEntity());
setEntity(savedEmployer);
return "success";
}
private void beforeSave() {
Employer emp = getEntity();
// associate this employer record with any products attached
for (EmployerProducts employerProduct : emp.getEmployerProductsList()) {
employerProduct.setEmployer(emp);
employerProduct.getEmployerProductsPlanList().forEach(x ->
x.setEmployerProducts(employerProduct));
}
// check to see if branding needs to be NULL. It will create the object from the select parameter with no id
// if a branding record has not been selected
if (emp.getBranding() != null && emp.getBranding().getBrandingNo() == null) {
emp.setBranding(null);
}
}
}
Employer DAO
@Repository
@Transactional
@Service
@Log4j
public class EmployerDao extends WebexchangeBaseDao implements IEmployerDao {
private Criteria criteria() {
return getCurrentSession().createCriteria(Employer.class);
}
@Override
@Transactional(readOnly = true)
public Employer read(Serializable id) {
return (Employer)getCurrentSession().load(Employer.class, id);
}
@Override
public Employer create(Employer employer) {
getCurrentSession().persist(employer);
return employer;
}
@Override
public Employer update(Employer employer) {
getCurrentSession().merge(employer);
return employer;
}
}
As of right now, my solution is to loop through the EmployerProducts and check for new records. I call persist() on the new ones before calling merge() on the parent Employer. I also moved the key-association logic into the DAO instead of having it in my Struts action. Below is what my update() method in the Employer DAO now looks like:
public Employer update(Employer employer) {
// associate this employer record with any products attached
for (EmployerProducts employerProduct : employer.getEmployerProductsList()) {
employerProduct.setEmployer(employer);
if (employerProduct.getEmployerProductsNo() == null) {
// The cascade down to employerProductsPlanList has issues getting the employerProductsNo
// automatically if the employerProduct does not exists yet. Persist the new employer product
// before we try to insert the new composite key in the plan
// https://stackoverflow.com/questions/54517061/hibernate-4-3-cascade-merge-through-multiple-lists-with-embeded-id
List<EmployerProductsPlan> plansToBeSaved = employerProduct.getEmployerProductsPlanList();
employerProduct.setEmployerProductsPlanList(new ArrayList<>());
getCurrentSession().persist(employerProduct);
// add the plans back in
employerProduct.setEmployerProductsPlanList(plansToBeSaved);
}
// associate the plan with the employer product
employerProduct.getEmployerProductsPlanList().forEach(x ->
x.getId().setEmployerProductsNo(employerProduct.getEmployerProductsNo())
);
}
return (Employer)getCurrentSession().merge(employer);
}
I have tables with the following structure:
orders
- id: bigint(20)
- amount: bigint(20)
order_details
- id: bigint(20)
- payment_type: varchar(255)
- order_fk: bigint(20)
Entities:
MyOrderEntity
@Entity
@Table(name = "orders")
public class MyOrderEntity {
@Id
@GeneratedValue(strategy = IDENTITY)
public Long id;
public Long amount;
@OneToOne(fetch = LAZY, mappedBy = "order", cascade = ALL)
public MyOrderDetailsEntity details;
}
MyOrderDetailsEntity
@Entity
@Table(name = "order_details")
public class MyOrderDetailsEntity {
@Id
@GeneratedValue(strategy = IDENTITY)
public Long id;
@OneToOne
@JoinColumn(name = "order_fk")
public MyOrderEntity order;
public String paymentType;
}
Repository:
@Repository
public interface MyOrderRepository extends JpaRepository<MyOrderEntity, Long> {}
I'm persisting MyOrderEntity this way:
MyOrderDetailsEntity details = new MyOrderDetailsEntity();
details.paymentType = "default";
MyOrderEntity order = new MyOrderEntity();
order.amount = 123L;
order.details = details;
myOrderRepository.save(order);
After saving the order, the order_details.order_fk column is null.
I want order_details.order_fk to be filled with order.id.
How can I do this?
You also need to explicitly set the MyOrderEntity on the MyOrderDetailsEntity; the JPA implementation does not do it for you. So add the line
details.order = order;
before saving.
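Put together with the snippet from the question, the persisting code would then look roughly like this:

MyOrderDetailsEntity details = new MyOrderDetailsEntity();
details.paymentType = "default";

MyOrderEntity order = new MyOrderEntity();
order.amount = 123L;
order.details = details;
details.order = order; // set the owning side so order_fk gets written

myOrderRepository.save(order);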
You can also add the following method to MyOrderEntity:
@PrePersist
private void prePersist() {
if(null!=details) details.order=this;
}
to avoid boilerplate code everywhere you set the MyOrderDetailsEntity on a MyOrderEntity.
But the best way is to make the MyOrderEntity.details field private and create a setter like:
public void setDetails(MyOrderDetailsEntity details) {
this.details = details;
details.order = this;
}
to keep it always set correctly, even before persisting. The best strategy depends on the case.
See this question and answers for more details.
I have written my own IdGenerator:
public class AkteIdGenerator implements IdentifierGenerator {
public Serializable generate(SessionImplementor session, Object object)
throws HibernateException {
// if custom id is set -> use this id
if (object instanceof SomeBean) {
SomeBean someBean = (SomeBean) object;
Long customId = someBean.getCustomId();
if (customId != 0) {
return customId;
}
}
// otherwise --> call the SequenceGenerator manually
SequenceStyleGenerator sequenceGenerator ...
}
}
Does anyone know how I could call the sequence generator from my generator class, which I would normally define via annotations:
@GeneratedValue(
strategy = GenerationType.SEQUENCE,
generator = "MY_SEQUENCE")
@SequenceGenerator(
allocationSize = 1,
name = "MY_SEQUENCE",
sequenceName = "MY_SEQUENCE_NAME")
I would be very thankful for any solutions!
Thanks a lot, Norbert
You can easily call the SequenceGenerator from your generator class by writing the following code.
The custom generator class should be:
public class StudentNoGenerator implements IdentifierGenerator {
public Serializable generate(SessionImplementor session, Object object)throws HibernateException {
SequenceGenerator generator=new SequenceGenerator();
Properties properties=new Properties();
properties.put("sequence","Stud_NoSequence");
generator.configure(Hibernate.STRING, properties, session.getFactory().getDialect());
return generator.generate(session, object);
}
}
In the above code, Stud_NoSequence is the sequence name, which should be created in the database by running: create sequence Stud_NoSequence;
Hibernate.STRING is the type that will be returned by the SequenceGenerator class.
The domain class will be:
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
@Entity
@org.hibernate.annotations.GenericGenerator(
name = "Custom-generator",
strategy = "com.ssis.id.StudentNoGenerator"
)
public class Student {
@Id @GeneratedValue(generator = "Custom-generator")
String rno;
@Column
String name;
public String getRno() {
return rno;
}
public void setRno(String rno) {
this.rno = rno;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
@Id
@GenericGenerator(name = "seq_id", strategy = "de.generator.AkteIdGenerator")
@GeneratedValue(generator = "seq_id")
@Column(name = "ID")
private Integer Id;
http://blog.anorakgirl.co.uk/2009/01/custom-hibernate-sequence-generator-for-id-field/
Not sure if this helps, but I kept coming across this post while searching for my answer, which I didn't find posted anywhere; I eventually found a solution myself, so I thought this might be the best place to share it.
If you are using Hibernate as the JPA provider, you can manually call an ID generator assigned to a given entity class. First inject the JpaContext:
@Autowired
org.springframework.data.jpa.repository.JpaContext jpaContext;
Then obtain the internal org.hibernate.id.IdentifierGenerator with this:
org.hibernate.engine.spi.SessionImplementor session = jpaContext.getEntityManagerByManagedType(MyEntity.class).unwrap(org.hibernate.engine.spi.SessionImplementor.class);
org.hibernate.id.IdentifierGenerator generator = session.getEntityPersister(null, new MyEntity()).getIdentifierGenerator();
Now you can obtain an ID from the generator programmatically:
Serializable id = generator.generate(session, new MyEntity());
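Putting those pieces together, a hypothetical helper method (the method and parameter names are placeholders, not part of any library API) might look like this:

// Sketch only: combines the steps above; entityInstance is a new, unsaved entity instance
public Serializable generateIdFor(Class<?> entityClass, Object entityInstance) {
    org.hibernate.engine.spi.SessionImplementor session = jpaContext
            .getEntityManagerByManagedType(entityClass)
            .unwrap(org.hibernate.engine.spi.SessionImplementor.class);
    org.hibernate.id.IdentifierGenerator generator = session
            .getEntityPersister(null, entityInstance)
            .getIdentifierGenerator();
    return generator.generate(session, entityInstance);
}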
Your post was helpful for updating the name of the sequence, because I use one sequence per month and the configuration is not refreshed for each identifier generation.
Here is my code:
@Override
public Serializable generate(SessionImplementor sessionImplementator,
Object object) throws HibernateException {
Calendar now = Calendar.getInstance();
// If month sequence is wrong, then reconfigure.
if (now.get(Calendar.MONTH) != SEQUENCE_DATE.get(Calendar.MONTH)) {
super.configure(new LongType(), new Properties(),
sessionImplementator.getFactory().getDialect());
}
Long id = (Long) super.generate(sessionImplementator, object);
String sId = String.format("%1$ty%1$tm%2$06d", SEQUENCE_DATE, id);
return Long.parseLong(sId);// 1301000001
}