Hibernate/Spring Data: Incorrect dirty check on field with AttributeConverter

I have the entity below, with a custom AttributeConverter that saves the field in the DB as binary data.
TaskEntity.java
@Entity
@Table(name = "task")
public class TaskEntity {

    @Id
    @GeneratedValue
    @Column(name = "id", nullable = false)
    private UUID id;

    @Column(name = "state_machine_context")
    @Convert(converter = StateMachineContextConverter.class)
    private StateMachineContext<State, Event> stateMachineContext;
}
StateMachineContextConverter.java
@Converter
public class StateMachineContextConverter
        implements AttributeConverter<StateMachineContext, byte[]> {

    private static final ThreadLocal<Kryo> kryoThreadLocal = ThreadLocal.withInitial(() -> {
        Kryo kryo = new Kryo();
        kryo.addDefaultSerializer(StateMachineContext.class, new StateMachineContextSerializer());
        kryo.addDefaultSerializer(MessageHeaders.class, new MessageHeadersSerializer());
        kryo.addDefaultSerializer(UUID.class, new UUIDSerializer());
        return kryo;
    });

    private static final int BUFFER_SIZE = 4096;
    private static final int MAX_BUFFERED_SIZE = 10240;

    @Override
    public byte[] convertToDatabaseColumn(final StateMachineContext attribute) {
        return serialize(attribute);
    }

    @Override
    public StateMachineContext convertToEntityAttribute(final byte[] dbData) {
        return deserialize(dbData);
    }

    private byte[] serialize(final StateMachineContext context) {
        if (context == null) {
            return null;
        }
        try (Output output = new Output(BUFFER_SIZE, MAX_BUFFERED_SIZE)) {
            final Kryo kryo = kryoThreadLocal.get();
            kryo.writeObject(output, context);
            return output.toBytes();
        }
    }

    private StateMachineContext deserialize(final byte[] data) {
        if (data == null || data.length == 0) {
            return null;
        }
        final Kryo kryo = kryoThreadLocal.get();
        try (Input input = new Input(data)) {
            return kryo.readObject(input, StateMachineContext.class);
        }
    }
}
After TaskEntity is selected with a Spring Data nativeQuery inside a method annotated with @Transactional, UPDATE queries are fired for all retrieved entities.
After investigating, I suppose this happens because of Hibernate's dirty checking: the context field is converted from byte[] and for some reason is considered dirty by Hibernate.
The interesting thing is that @Transactional(readOnly = true) does not help, as Postgres throws the exception "Could not UPDATE in readOnly transaction"; but if I remove the @Transactional annotation completely, everything works fine and no UPDATE queries are fired after the select.
What is the best solution to fix this issue? Maybe it is possible to disable dirty checking for read-only transactions? Is it possible to override the Hibernate dirty check for a single field? I found that it is possible to override dirty checking completely, but I would prefer not to do this.

I faced the same issue when I was using a converter to convert my JSON field from the DB to my custom class.
Hibernate's dirty-checking policy calls the .equals method on the entity state held in the persistence context (which is saved as soon as you fetch an object from the DB) against your current entity state.
So overriding the .equals method of StateMachineContext should do it for you. It actually worked for me.
For reference: https://medium.com/@paul.klingelhuber/hibernate-dirty-checking-with-converted-attributes-1b6d1cd27f68
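For illustration, a minimal sketch of such an override, assuming the converter deserializes into Spring Statemachine's DefaultStateMachineContext and that state, event, and event headers are what define equality in your case (adjust the compared fields to whatever your contexts actually carry):

import java.util.List;
import java.util.Objects;
import org.springframework.statemachine.StateMachineContext;
import org.springframework.statemachine.support.DefaultStateMachineContext;

public class ComparableStateMachineContext extends DefaultStateMachineContext<State, Event> {

    public ComparableStateMachineContext(List<StateMachineContext<State, Event>> childs,
            State state, Event event) {
        // no event headers or extended state in this minimal sketch
        super(childs, state, event, null, null);
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof StateMachineContext)) return false;
        StateMachineContext<?, ?> other = (StateMachineContext<?, ?>) o;
        return Objects.equals(getState(), other.getState())
                && Objects.equals(getEvent(), other.getEvent())
                && Objects.equals(getEventHeaders(), other.getEventHeaders());
    }

    @Override
    public int hashCode() {
        return Objects.hash(getState(), getEvent(), getEventHeaders());
    }
}

For the override to take effect, the Kryo StateMachineContextSerializer would also have to return this subclass when deserializing.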

Related

Create mapped entity when you only have the id

I'm not sure how to phrase the question title, to be honest; if someone has a suggestion, please let me know.
My use case is this: I have an entity with an account property, like so (this is cleaned up to avoid clutter):
@Entity
@Table(name = "report_line", schema = "public")
public class ReportLine extends BaseReportLine {

    @ManyToOne
    @JoinColumn(name = "report_id")
    private Report report;

    @ManyToOne
    @JoinColumn(name = "account_id")
    private Account account;
}
But a DTO that only has an account id / different properties:
public class ImportLineDto {
    public String groupName;
    public Integer position;
    public Integer parentPosition;
    public String accountId;
    public String name;
    public BigDecimal amount;
    public List<ImportLineDto> lines = new ArrayList<>();
}
I need to go through / flatten all lines so I can save them to a JPA repository, but there are a few issues:
1. Is there a way to create the ReportLine object using the accountId only, without having to look up the account for each line? That would add a massive number of unnecessary DB calls.
2. What should I do with the 'lines' on each object after flattening? Should I set them to null / an empty list?
3. Is there a better way to do this? For once I can actually make changes to the code.
Here is what I have so far:
private void saveReport(ImportedResult result) {
    Report report = new Report();
    ...
    report.setLines(getReportLinesFromDtoLines(result.lineItems.lines));
    reportRepository.saveAndFlush(report);
}

private List<ReportLine> getReportLinesFromDtoLines(ImportLineDto lines) {
    List<ImportLineDto> flatLines = flatMapRecursive(lines).collect(Collectors.toList());
    List<ReportLine> reportLines = new ArrayList<>();
    for (ImportLineDto line : flatLines) {
        ReportLine reportLine = new ReportLine();
        reportLine.setItemText(line.name);
        reportLine.setAmount(line.amount);
        reportLine.setAccount(???);
        // how do I set the 'Account' property using the id only, without looking up each account?
        reportLines.add(reportLine);
    }
    return reportLines;
}
public Stream<ImportLineDto> flatMapRecursive(ImportLineDto item) {
    if (item.lines == null) {
        // include the item itself even when it has no children
        return Stream.of(item);
    }
    return Stream.concat(Stream.of(item), item.lines.stream()
            .flatMap(this::flatMapRecursive));
}
Follow-up:
Just to throw a wrench in there: what if the DTO accountId was not the actual "id" field in the table, but another custom field? I have another situation like that; would it even be possible? I still need the answer to the 1st question, however, with a standard id.
You may use entityManager.getReference, as explained here:
reportLine.setAccount(entityManager.getReference(Account.class, line.accountId));
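Applied to the loop from the question, that could look like the sketch below (assuming an injected EntityManager; getReference returns a lazy proxy without hitting the database, but it works only with the primary key, so the follow-up case of a custom column would still need a real query or a natural-id lookup):

@PersistenceContext
private EntityManager entityManager;

private List<ReportLine> getReportLinesFromDtoLines(ImportLineDto lines) {
    List<ReportLine> reportLines = new ArrayList<>();
    for (ImportLineDto line : flatMapRecursive(lines).collect(Collectors.toList())) {
        ReportLine reportLine = new ReportLine();
        reportLine.setItemText(line.name);
        reportLine.setAmount(line.amount);
        // no SELECT is issued here; the Account proxy is resolved lazily if ever accessed
        reportLine.setAccount(entityManager.getReference(Account.class, line.accountId));
        reportLines.add(reportLine);
    }
    return reportLines;
}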

Serializing List to JSON Array

I have a JPA entity with a List of custom objects as one of its fields. Using a Jackson converter, I've managed to persist this list as a JSON array in a MySQL database, but I am unable to insert into this list after its initial creation.
I can successfully retrieve the existing list, add a new object in memory (and verify that it has been inserted), then save it via a Spring REST repository. However, it never seems to persist. Any ideas? Here is my code (this is a Spring Boot project, FYI):
Candidate entity with a List inside
@Entity
@Table(name = "Candidates", schema = "Candidate")
public class Candidate extends ResourceSupport {

    @Id
    @Column(name = "CandidateID")
    private Long candidateID;

    // More fields

    @Column(name = "Fields")
    @Convert(converter = CollectionConverter.class)
    private List<CandidateField> fields;

    // Getters & setters
}
CandidateField class which makes up the List above. CandidateField is simply a POJO that models the JSON stored in a single field in the Candidate table; it is not an independent entity.
public class CandidateField {

    private Long fieldID;
    private String name;
    private boolean current;

    public CandidateField() {
    }

    public CandidateField(Long fieldID, String name, boolean current) {
        this.fieldID = fieldID;
        this.name = name;
        this.current = current;
    }

    // Getters & setters
}
Converter
public class CollectionConverter implements AttributeConverter<List<CandidateField>, String> {

    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(List<CandidateField> object) {
        try {
            return objectMapper.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
            return "";
        }
    }

    @Override
    public List<CandidateField> convertToEntityAttribute(String data) {
        try {
            return objectMapper.readValue(data, new TypeReference<List<CandidateField>>() {});
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }
}
Code that persists to database
public void addField(Long fieldID, Long candidateID) {
    Candidate candidate = repository.findOne(candidateID);
    candidate.getFields().add(new CandidateField(fieldID, "", true));
    repository.saveAndFlush(candidate);
}
Repository
@RepositoryRestResource
public interface CandidateRepository extends JpaRepository<Candidate, Long> {}
I can't seem to figure out why this won't persist. Any help will be very much appreciated. Cheers!
Consider defining the cascade type for your collection.
When you persist your Candidate objects, the operation is not cascaded by default, so you need to define the cascade yourself unless you persist the CandidateField objects directly.
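For illustration, a sketch of what declaring the cascade could look like, assuming the collection were remapped as a real entity association (in the converter-based mapping above, CandidateField is not an entity, so this applies only if you promote it to one; names are illustrative):

// Hypothetical remapping: CandidateField as an @Entity with its own table,
// so that saving a Candidate cascades to its fields.
@OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
@JoinColumn(name = "CandidateID")
private List<CandidateField> fields;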

JPA Stored Procedure result set mappings and NonUniqueResultException

I'm fairly new to JPA and am trying to use a stored procedure to run a query and map its results to my Java classes.
Here are the tables:
CREATE TABLE dbo.Branding
(
    Branding_ID INT IDENTITY NOT NULL
        CONSTRAINT PK_Branding PRIMARY KEY CLUSTERED,
    BrandingType_ID INT,
    Reseller_ID INT NULL,
    Host VARCHAR(MAX) NULL
)

CREATE TABLE dbo.BrandingResource
(
    BrandingResource_ID INT IDENTITY NOT NULL
        CONSTRAINT PK_BrandingResource PRIMARY KEY CLUSTERED,
    Branding_ID INT NOT NULL,
    Name VARCHAR(255) NOT NULL,
    [Value] VARCHAR(MAX) NOT NULL
)

CREATE TABLE dbo.BrandingType
(
    BrandingType_ID INT IDENTITY NOT NULL
        CONSTRAINT PK_BrandingType PRIMARY KEY CLUSTERED,
    Description VARCHAR(255)
)
Here are the entities:
#Table(name = "[Branding]")
#Entity
public class Branding extends CommonDomainBase
{
#Id
#Column(name = "branding_id")
private int id;
#OneToOne(optional = false)
#JoinColumn(name = "brandingtype_id", nullable = false)
private BrandingType type;
#OneToMany(fetch = FetchType.EAGER)
#JoinColumn(name = "branding_id", referencedColumnName = "branding_id")
private Set<BrandingResource> brandingResources;
#Column(name = "reseller_id", nullable = true)
private Integer resellerId;
#Column(name = "host", nullable = true)
private String host;
}
#Table(name = "[BrandingResource]")
#Entity
public class BrandingResource extends CommonDomainBase
{
#Id
#Column(name = "BrandingResource_Id")
private int id;
#Column(name = "Name")
private String name;
#Column(name = "Value")
private String value;
}
#Table(name = "[BrandingType]")
#Entity
public class BrandingType extends CommonDomainBase
{
#Id
#Column(name = "brandingtype_id")
private int id;
#Column(name = "description")
private String description;
}
I already know that the annotations on the entities are working correctly. When I use Spring Data JPA repositories to query the 3 tables to find one or find all of Branding, I get a generated query which retrieves all 3 tables in a single query.
I am now trying to extend this to do the same sort of result set mapping using a named stored procedure, which I've configured in the following way:
@NamedStoredProcedureQuery(name = "Branding.getBrandingByHost", procedureName = "spGetBrandingByHost",
        parameters = { @StoredProcedureParameter(mode = ParameterMode.IN, name = "host", type = String.class) },
        resultSetMappings = { "BrandingResults" })
@SqlResultSetMapping(name = "BrandingResults", entities = { @EntityResult(entityClass = Branding.class) })
The stored procedure returns duplicate rows for each row in the Branding table, due to the one-to-many relationship to BrandingResource.
The result set produced by the Spring Data JPA repository's generated query has duplicate rows in the same way as my procedure, and the mapping handles this perfectly when mapping to the objects. When using the named stored procedure, however, I get the following exception:
javax.persistence.NonUniqueResultException: Call to stored procedure [spGetBrandingByHost] returned multiple results
I understand that I will probably need to include more result set mappings for this to work, but cannot find an example that demonstrates anything similar. Is what I'm after even possible?
Thanks in advance
In answer to my own question: no, you can't, which does make sense. When automatically generating queries, Hibernate knows what column names to expect in the result set, including any that are duplicated from a one-to-many/many-to-one relationship. A stored procedure can return columns that Hibernate does not know to expect, so setting them explicitly is required.
After much digging I did find the class org.hibernate.cfg.annotations.ResultsetMappingSecondPass, which is called to map the JPA annotations to a native Hibernate org.hibernate.engine.ResultSetMappingDefinition and, reading the source code, I can see it completely ignores most of the annotations for columns and joining.
It would be great if @NamedStoredProcedureQuery could support one-to-many/many-to-one joins. For now I've created my own solution:
public class EntityResultSetSecondPass implements QuerySecondPass
{
    private static final String ALIAS = EntityResultSetSecondPass.class.getName() + "_alias";

    private final InFlightMetadataCollector metadataCollector;
    private int entityAliasIndex;
    private final Map<Class<?>, String> aliasMap = new ConcurrentHashMap<>();

    public EntityResultSetSecondPass(final InFlightMetadataCollector metadataCollector)
    {
        this.metadataCollector = metadataCollector;
    }

    @Override
    public void doSecondPass(final Map persistentClasses) throws MappingException
    {
        for (final Object key : persistentClasses.keySet())
        {
            final String className = key.toString();
            try
            {
                final Class<?> clazz = Class.forName(className);
                final EntityResultSet entityResultSet = clazz.getDeclaredAnnotation(EntityResultSet.class);
                if (entityResultSet == null)
                {
                    continue;
                }
                else
                {
                    createEntityResultDefinition(entityResultSet, clazz);
                }
            }
            catch (final ClassNotFoundException e)
            {
                throw new HibernateException(e);
            }
        }
    }

    private void createEntityResultDefinition(final EntityResultSet entityResultSet, final Class<?> entityClass)
            throws ClassNotFoundException
    {
        final List<NativeSQLQueryReturn> mappedReturns = new ArrayList<>();
        final ResultSetMappingDefinition definition = new ResultSetMappingDefinition(entityResultSet.name());
        final Map<Class<?>, FieldResult[]> returnedEntities = new ConcurrentHashMap<>();
        returnedEntities.put(entityClass, entityResultSet.fields());
        for (final EntityResult entityResult : entityResultSet.relatedEntities())
        {
            returnedEntities.put(entityResult.entityClass(), entityResultSet.fields());
        }
        definition.addQueryReturn(new NativeSQLQueryRootReturn(getOrCreateAlias(entityClass), entityClass.getName(),
                getPropertyResults(entityClass, entityResultSet.fields(), returnedEntities, mappedReturns, ""),
                LockMode.READ));
        for (final EntityResult entityResult : entityResultSet.relatedEntities())
        {
            definition.addQueryReturn(
                    new NativeSQLQueryRootReturn(getOrCreateAlias(entityResult.entityClass()),
                            entityResult.entityClass().getName(),
                            getPropertyResults(entityResult.entityClass(),
                                    entityResult.fields(), returnedEntities, mappedReturns, ""),
                            LockMode.READ));
        }
        for (final NativeSQLQueryReturn mappedReturn : mappedReturns)
        {
            definition.addQueryReturn(mappedReturn);
        }
        metadataCollector.addResultSetMapping(definition);
    }

    private Map<String, String[]> getPropertyResults(final Class<?> entityClass, final FieldResult[] fields,
            final Map<Class<?>, FieldResult[]> returnedEntities, final List<NativeSQLQueryReturn> mappedReturns,
            final String prefix) throws ClassNotFoundException
    {
        final Map<String, String[]> properties = new ConcurrentHashMap<>();
        for (final Field field : entityClass.getDeclaredFields())
        {
            final Column column = field.getAnnotation(Column.class);
            if (column != null)
            {
                properties.put(prefix + field.getName(), new String[] { column.name() });
            }
            final JoinColumn joinColumn = field.getAnnotation(JoinColumn.class);
            if (joinColumn != null)
            {
                properties.putAll(handleJoinColumn(entityClass, field, joinColumn, returnedEntities, mappedReturns));
            }
        }
        if (entityClass.getSuperclass() != null)
        {
            properties.putAll(
                    getPropertyResults(entityClass.getSuperclass(), fields, returnedEntities, mappedReturns, prefix));
        }
        return properties;
    }

    private Map<String, String[]> handleJoinColumn(final Class<?> sourceEntity, final Field field,
            final JoinColumn joinColumn, final Map<Class<?>, FieldResult[]> returnedEntities,
            final List<NativeSQLQueryReturn> mappedReturns) throws ClassNotFoundException
    {
        final Map<String, String[]> properties = new ConcurrentHashMap<>();
        final OneToOne oneToOne = field.getAnnotation(OneToOne.class);
        if (oneToOne != null)
        {
            properties.put(field.getName(), new String[] { joinColumn.name() });
        }
        final OneToMany oneToMany = field.getAnnotation(OneToMany.class);
        if (oneToMany != null)
        {
            Class<?> fieldType;
            if (field.getType().isArray())
            {
                fieldType = field.getType();
            }
            else if (Collection.class.isAssignableFrom(field.getType()))
            {
                fieldType = Class.forName(
                        ParameterizedType.class.cast(field.getGenericType()).getActualTypeArguments()[0].getTypeName());
            }
            else
            {
                throw new UnsupportedOperationException("One to many only supports collection and array types");
            }
            if (returnedEntities.keySet().contains(fieldType))
            {
                properties.put(field.getName(), new String[] { joinColumn.name() });
                final Map<String, String[]> resolvedProperties = getPropertyResults(fieldType,
                        returnedEntities.get(fieldType), returnedEntities, mappedReturns, "element.");
                resolvedProperties.put("key", new String[] { joinColumn.referencedColumnName() });
                resolvedProperties.put("element", new String[] { joinColumn.name() });
                mappedReturns.add(new NativeSQLQueryCollectionReturn(getOrCreateAlias(fieldType),
                        sourceEntity.getName(), field.getName(), resolvedProperties, LockMode.READ));
                mappedReturns.add(new NativeSQLQueryJoinReturn(getOrCreateAlias(fieldType),
                        getOrCreateAlias(sourceEntity), field.getName(),
                        getPropertyResults(fieldType, returnedEntities.get(fieldType), returnedEntities,
                                mappedReturns, ""),
                        LockMode.READ));
            }
        }
        return properties;
    }

    private String getOrCreateAlias(final Class<?> entityClass)
    {
        if (!aliasMap.containsKey(entityClass))
        {
            aliasMap.put(entityClass, ALIAS + entityAliasIndex++);
        }
        return aliasMap.get(entityClass);
    }
}
and the accompanying annotation:
#Target(ElementType.TYPE)
#Retention(RetentionPolicy.RUNTIME)
public #interface EntityResultSet
{
/**
* The name of the result set
*
* #return
*/
String name();
/**
* The {#link FieldResult} to override those of the {#link Column}s on the
* current {#link Entity}
*
* #return
*/
FieldResult[] fields() default {};
/**
* The {#link EntityResult} that define related {#link Entity}s that are
* included in this result set.
*
* </p>Note: discriminatorColumn has no impact in this usage
*
* #return
*/
EntityResult[] relatedEntities() default {};
}
This is all registered with Hibernate via a MetadataContributor.
The code is a bit of a mess, but it is actually working. It basically looks for @EntityResultSet, in which the entities for a particular result set are defined. The EntityResultSetSecondPass looks at these given entities and generates a ResultSetMappingDefinition which includes all the joining metadata for collection mapping. It works from all the standard column annotations, but these can be overridden with the FieldResults defined in @EntityResultSet.
It seems a bit nasty, but it's working nicely.
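For completeness, a minimal sketch of what the MetadataContributor registration might look like (the class name is illustrative; the contributor is discovered via a META-INF/services/org.hibernate.boot.spi.MetadataContributor file):

import org.hibernate.boot.spi.InFlightMetadataCollector;
import org.hibernate.boot.spi.MetadataContributor;
import org.jboss.jandex.IndexView;

public class EntityResultSetMetadataContributor implements MetadataContributor
{
    @Override
    public void contribute(InFlightMetadataCollector metadataCollector, IndexView jandexIndex)
    {
        // queue the custom second pass so it runs once the entity bindings exist
        metadataCollector.addSecondPass(new EntityResultSetSecondPass(metadataCollector));
    }
}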

ElasticSearch index

Elasticsearch indexes new records created through the UI, but records created by Liquibase files are not indexed, so they don't appear in search results. Elasticsearch should index all records, whether created by the UI or by Liquibase files. Is there any process for indexing the records created in Liquibase files?
Liquibase only makes changes to your database. Unless you have some process which listens to the database changes and then updates Elasticsearch, you will not see the changes.
There might be multiple ways to get your database records into Elasticsearch:
1. Your UI probably calls some back-end code that already indexes a create or an update into Elasticsearch.
2. Have a batch process which knows which records have changed (e.g. use an updated flag column or an updated_timestamp column) and then index those into Elasticsearch.
The second option can be done in code via a scripted or back-end scheduled job (sketched below), or you might be able to use Logstash with the jdbc-input plugin.
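For the second option, a rough sketch of such a scheduled job, assuming Spring (the RecordRepository/RecordSearchRepository repositories and the updatedTimestamp property are hypothetical placeholders for your own):

import java.time.Instant;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ElasticsearchSyncJob {

    private final RecordRepository repository;              // hypothetical JPA repository
    private final RecordSearchRepository searchRepository;  // hypothetical Spring Data Elasticsearch repository
    private Instant lastRun = Instant.EPOCH;

    public ElasticsearchSyncJob(RecordRepository repository, RecordSearchRepository searchRepository) {
        this.repository = repository;
        this.searchRepository = searchRepository;
    }

    @Scheduled(fixedDelay = 60_000) // re-index changed rows every minute
    public void syncChangedRecords() {
        Instant now = Instant.now();
        // derived query on the assumed updated_timestamp column
        searchRepository.saveAll(repository.findByUpdatedTimestampAfter(lastRun));
        lastRun = now;
    }
}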
As Sarwar Bhuiyan and Mogsdad said:
Unless you have some process which listens to the database changes and
then updates Elasticsearch
You can use Liquibase to populate Elasticsearch (this task will be executed once, just like a normal migration). To do this you need to create a customChange:
<customChange class="org.test.ElasticMigrationByEntityName">
    <param name="entityName" value="org.test.TestEntity" />
</customChange>
In that Java-based migration you can call the services you need. Here is an example of what you can do (please do not use the code from this example in production).
public class ElasticMigrationByEntityName implements CustomTaskChange {

    private String entityName;

    public String getEntityName() {
        return entityName;
    }

    public void setEntityName(String entityName) {
        this.entityName = entityName;
    }

    @Override
    public void execute(Database database) {
        // We schedule the task for the next execution. We are waiting for the context to start, which gives us access to the beans
        DelayedTaskExecutor.add(new DelayedTask(entityName));
    }

    @Override
    public String getConfirmationMessage() {
        return "OK";
    }

    @Override
    public void setUp() throws SetupException {
    }

    @Override
    public void setFileOpener(ResourceAccessor resourceAccessor) {
    }

    @Override
    public ValidationErrors validate(Database database) {
        return new ValidationErrors();
    }

    /* ===================== */

    public static class DelayedTask implements Consumer<ApplicationContext> {

        private final String entityName;

        public DelayedTask(String entityName) {
            this.entityName = entityName;
        }

        @Override
        public void accept(ApplicationContext applicationContext) {
            try {
                checkedAccept(applicationContext);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }

        // We're going to find beans by name (the most controversial point)
        private void checkedAccept(ApplicationContext context) throws ClassNotFoundException {
            Class entityClass = Class.forName(entityName);
            String name = entityClass.getSimpleName();
            // Please do not use this code in production
            String repositoryName = org.apache.commons.lang3.StringUtils.uncapitalize(name + "Repository");
            String repositorySearchName = org.apache.commons.lang3.StringUtils.uncapitalize(name + "SearchRepository");
            JpaRepository repository = (JpaRepository) context.getBean(repositoryName);
            ElasticsearchRepository searchRepository = (ElasticsearchRepository) context.getBean(repositorySearchName);
            // Doing our work
            updateData(repository, searchRepository);
        }

        // Write your logic here
        private void updateData(JpaRepository repository, ElasticsearchRepository searchRepository) {
            searchRepository.saveAll(repository.findAll());
        }
    }
}
Because the beans have not yet been created, we will have to wait for them:
@Component
public class DelayedTaskExecutor {

    @Autowired
    private ApplicationContext context;

    @EventListener
    // We are waiting for the app to launch
    public void onAppReady(ApplicationReadyEvent event) {
        Queue<Consumer<ApplicationContext>> localQueue = getQueue();
        if (localQueue.size() > 0) {
            for (Consumer<ApplicationContext> consumer = localQueue.poll(); consumer != null; consumer = localQueue.poll()) {
                consumer.accept(context);
            }
        }
    }

    public static void add(Consumer<ApplicationContext> consumer) {
        getQueue().add(consumer);
    }

    public static Queue<Consumer<ApplicationContext>> getQueue() {
        return Holder.QUEUE;
    }

    private static class Holder {
        private static final Queue<Consumer<ApplicationContext>> QUEUE = new ConcurrentLinkedQueue<>();
    }
}
An entity example:
#Entity
#Table(name = "test_entity")
#Document(indexName = "testentity")
public class TestEntity implements Serializable {
private static final long serialVersionUID = 1L;
#Id
#Field(type = FieldType.Keyword)
#GeneratedValue(generator = "uuid")
#GenericGenerator(name = "uuid", strategy = "uuid2")
private String id;
#NotNull
#Column(name = "code", nullable = false, unique = true)
private String code;
...
}

Bypass GeneratedValue in Hibernate (merge data not in db?)

My problem is the same as described in [1] or [2]: I need to manually set a by-default auto-generated value (why? importing old data). As described in [1], using Hibernate's entity = em.merge(entity) will do the trick.
Unfortunately for me it does not. I get neither an error nor any other warning; the entity just never appears in the database. I'm using Spring and Hibernate EntityManager 3.5.3-Final.
Any ideas?
Another implementation, way simpler.
This one works with both annotation-based and XML-based configuration: it relies on Hibernate metadata to get the id value for the object. Replace SequenceGenerator with IdentityGenerator (or any other generator) depending on your configuration. (The creation of a decorator instead of subclassing, passing the decorated ID generator as a parameter to this generator, is left as an exercise for the reader.)
public class UseExistingOrGenerateIdGenerator extends SequenceGenerator {

    @Override
    public Serializable generate(SessionImplementor session, Object object)
            throws HibernateException {
        Serializable id = session.getEntityPersister(null, object)
                .getClassMetadata().getIdentifier(object, session);
        return id != null ? id : super.generate(session, object);
    }
}
Answer to the exercise (using a decorator pattern, as requested), not really tested:
public class UseExistingOrGenerateIdGenerator implements IdentifierGenerator, Configurable {

    private IdentifierGenerator defaultGenerator;

    @Override
    public void configure(Type type, Properties params, Dialect d)
            throws MappingException {
        // For example: take a class name and create an instance
        this.defaultGenerator = buildGeneratorFromParams(
                params.getProperty("default"));
    }

    @Override
    public Serializable generate(SessionImplementor session, Object object)
            throws HibernateException {
        Serializable id = session.getEntityPersister(null, object)
                .getClassMetadata().getIdentifier(object, session);
        return id != null ? id : defaultGenerator.generate(session, object);
    }
}
It works on my project with the following code:
@XmlAttribute
@Id
@Basic(optional = false)
@GeneratedValue(strategy = GenerationType.IDENTITY, generator = "IdOrGenerated")
@GenericGenerator(name = "IdOrGenerated", strategy = "....UseIdOrGenerate")
@Column(name = "ID", nullable = false)
private Integer id;
and
import org.hibernate.id.IdentityGenerator;
...

public class UseIdOrGenerate extends IdentityGenerator {

    private static final Logger log = Logger.getLogger(UseIdOrGenerate.class.getName());

    @Override
    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        if (obj == null) throw new HibernateException(new NullPointerException());
        if (((EntityWithId) obj).getId() == null) {
            Serializable id = super.generate(session, obj);
            return id;
        } else {
            return ((EntityWithId) obj).getId();
        }
    }
}
where you basically define your own ID generator (based on the identity strategy), and if the ID is not set, you delegate the generation to the default generator.
The main drawback is that it binds you to Hibernate as the JPA provider ... but it works perfectly with my MySQL project.
Updating Laurent Grégoire's answer for Hibernate 5.2, because it seems to have changed a bit.
public class UseExistingIdOtherwiseGenerateUsingIdentity extends IdentityGenerator {

    @Override
    public Serializable generate(SharedSessionContractImplementor session, Object object) throws HibernateException {
        Serializable id = session.getEntityPersister(null, object).getClassMetadata().getIdentifier(object, session);
        return id != null ? id : super.generate(session, object);
    }
}
and use it like this: (replace the package name)
@Id
@GenericGenerator(name = "UseExistingIdOtherwiseGenerateUsingIdentity", strategy = "{package}.UseExistingIdOtherwiseGenerateUsingIdentity")
@GeneratedValue(generator = "UseExistingIdOtherwiseGenerateUsingIdentity")
@Column(unique = true, nullable = false)
protected Integer id;
I'm giving a solution here that worked for me:
Create your own identifier/sequence generator:
public class FilterIdentifierGenerator extends IdentityGenerator implements IdentifierGenerator {

    @Override
    public Serializable generate(SessionImplementor session, Object object)
            throws HibernateException {
        Serializable id = session.getEntityPersister(null, object)
                .getClassMetadata().getIdentifier(object, session);
        return id != null ? id : super.generate(session, object);
    }
}
modify your entity as:
@Id
@GeneratedValue(generator = "myGenerator")
@GenericGenerator(name = "myGenerator", strategy = "package.FilterIdentifierGenerator")
@Column(unique = true, nullable = false)
private int id;
...
and while saving, instead of using persist(), use merge() or update().
If you are using Hibernate's org.hibernate.id.UUIDGenerator to generate a String id, I suggest you use:
public class UseIdOrGenerate extends UUIDGenerator {

    @Override
    public Serializable generate(SharedSessionContractImplementor session, Object object) throws HibernateException {
        Serializable id = session.getEntityPersister(null, object).getClassMetadata().getIdentifier(object, session);
        return id != null ? id : super.generate(session, object);
    }
}
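Wired up the same way as the other variants shown above, e.g. (the package and column names are illustrative):

@Id
@GeneratedValue(generator = "UseIdOrGenerate")
@GenericGenerator(name = "UseIdOrGenerate", strategy = "com.example.UseIdOrGenerate") // hypothetical package
@Column(name = "id", nullable = false)
private String id;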
According to the "Selectively disable generation of a new ID" thread on the Hibernate forums, merge() might not be the solution (at least not alone) and you might have to use a custom generator (that's the second link you posted).
I didn't test this myself so I can't confirm, but I recommend reading that thread on the Hibernate forums.
For anyone else looking to do this, the above does work nicely. Just a recommendation: to get the identifier from the object via reflection rather than having inheritance for each entity class (just for the id), you could do something like:
import org.hibernate.id.IdentityGenerator;

public class UseIdOrGenerate extends IdentityGenerator {

    private static final Logger log = Logger.getLogger(UseIdOrGenerate.class.getName());

    @Override
    public Serializable generate(SessionImplementor session, Object object)
            throws HibernateException {
        if (object == null)
            throw new HibernateException(new NullPointerException());
        for (Field field : object.getClass().getDeclaredFields()) {
            if (field.isAnnotationPresent(Id.class)
                    && field.isAnnotationPresent(GeneratedValue.class)) {
                boolean isAccessible = field.isAccessible();
                try {
                    field.setAccessible(true);
                    Object obj = field.get(object);
                    field.setAccessible(isAccessible);
                    if (obj != null) {
                        if (Integer.class.isAssignableFrom(obj.getClass())) {
                            if (((Integer) obj) > 0) {
                                return (Serializable) obj;
                            }
                        }
                    }
                } catch (IllegalArgumentException | IllegalAccessException e) {
                    e.printStackTrace();
                }
            }
        }
        return super.generate(session, object);
    }
}
You need a running transaction.
In case your transactions are manually managed:
entityManager.getTransaction().begin();
(of course, don't forget to commit)
If you are using declarative transactions, use the appropriate declaration (via annotations, most likely), as sketched below.
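A minimal sketch of the declarative variant, assuming Spring (the service and entity names are illustrative):

@Service
public class ImportService {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional // a transaction is opened for the duration of this method
    public MyEntity importWithExistingId(MyEntity entity) {
        return entityManager.merge(entity);
    }
}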
Also, set the Hibernate logging level to debug (log4j.logger.org.hibernate=debug) in your log4j.properties in order to trace in more detail what is happening.
