SequenceGenerator to SequenceStyleGenerator when moving from Hibernate 4.2 to 5 - java

I've recently upgraded my project from Hibernate 4.2 to Hibernate 5. We have database sequences that follow the naming convention "SEQ_PrimaryKeyName". In Hibernate 4.2 we were using org.hibernate.id.SequenceGenerator to generate ids - the code looks like this:
public class PrimaryKeyGenerator extends IdentityGenerator implements Configurable {

    private SequenceGenerator pkGen;

    public PrimaryKeyGenerator() {
        pkGen = new SequenceGenerator();
    }

    // Configure the sequence generator
    public void configure(Type type, Properties params, Dialect dialect) throws MappingException {
        if (pkGen instanceof Configurable) {
            String seqName = "SEQ_" + params.getProperty(PersistentIdentifierGenerator.PK);
            params.setProperty(SequenceGenerator.SEQUENCE, seqName);
            ((Configurable) pkGen).configure(type, params, dialect);
        }
    }

    // Generate the id value
    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        return pkGen.generate(session, obj);
    }
}
SequenceGenerator is deprecated in Hibernate 5 and, per the Javadoc, it is recommended to use org.hibernate.id.enhanced.SequenceStyleGenerator instead.
I modified my existing PrimaryKeyGenerator class to the following:
public class PrimaryKeyGenerator extends IdentityGenerator implements Configurable {

    private SequenceStyleGenerator pkGen;

    public PrimaryKeyGenerator() {
        pkGen = new SequenceStyleGenerator();
    }

    // Configure the delegate SequenceStyleGenerator
    @Override
    public void configure(Type type, Properties params, ServiceRegistry serviceRegistry) throws MappingException {
        if (pkGen instanceof Configurable) {
            String seqName = "SEQ_" + params.getProperty(PersistentIdentifierGenerator.PK);
            params.setProperty(SequenceStyleGenerator.SEQUENCE_PARAM, seqName);
            ((Configurable) pkGen).configure(type, params, serviceRegistry);
        }
    }

    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        // Collect an instance of org.hibernate.boot.model.relational.Database
        Database db = MetadataExtractor.INSTANCE.getDatabase();
        ((SequenceStyleGenerator) pkGen).registerExportables(db);
        return pkGen.generate(session, obj);
    }
}
With the above changes, sequences are generated properly. The one question I have: is it all right to call registerExportables() before calling the generate method? I am not sure, but it looks like registerExportables() should be called only once, not every time generate() is called. If I don't make the call explicitly, I get the following exception:
"SequenceStyleGenerator's SequenceStructure was not properly initialized"
Maybe the way I am trying to use SequenceStyleGenerator is not correct.
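For what it's worth, the once-only registration can probably be left to Hibernate itself: SequenceStyleGenerator implements ExportableProducer, and as far as I can tell Hibernate calls registerExportables(Database) exactly once at bootstrap on id generators that expose that interface. A minimal sketch under that assumption (the MetadataExtractor lookup then becomes unnecessary):

public class PrimaryKeyGenerator extends IdentityGenerator implements Configurable, ExportableProducer {

    private final SequenceStyleGenerator pkGen = new SequenceStyleGenerator();

    @Override
    public void configure(Type type, Properties params, ServiceRegistry serviceRegistry) throws MappingException {
        String seqName = "SEQ_" + params.getProperty(PersistentIdentifierGenerator.PK);
        params.setProperty(SequenceStyleGenerator.SEQUENCE_PARAM, seqName);
        pkGen.configure(type, params, serviceRegistry);
    }

    @Override
    public void registerExportables(Database database) {
        // Invoked once by Hibernate while building the boot metamodel,
        // so generate() no longer needs to do the registration itself
        pkGen.registerExportables(database);
    }

    @Override
    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        return pkGen.generate(session, obj);
    }
}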

Related

How to use Mongo Auditing and a UUID as id with Spring Boot 2.2.x?

I would like to have Documents stored with a UUID id and createdAt / updatedAt fields. My solution was working with Spring Boot 2.1.x. After I upgraded from Spring Boot 2.1.11.RELEASE to 2.2.0.RELEASE, my test for MongoAuditing failed with createdAt = null. What do I need to do to get the createdAt field filled again?
This is not just a test problem. I ran the application and it behaves the same way as my test: all auditing fields stay null.
I have a Configuration to enable MongoAuditing and UUID generation:
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {

    @Bean
    public GenerateUUIDListener generateUUIDListener() {
        return new GenerateUUIDListener();
    }
}
The listener hooks into onBeforeConvert - I guess that's where the trouble starts.
public class GenerateUUIDListener extends AbstractMongoEventListener<IdentifiableEntity> {

    @Override
    public void onBeforeConvert(BeforeConvertEvent<IdentifiableEntity> event) {
        IdentifiableEntity entity = event.getSource();
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
    }
}
The document itself (I dropped the getters and setters):

@Document
public class MyDocument extends InsertableEntity {
    private String name;
}

public abstract class InsertableEntity extends IdentifiableEntity {

    @CreatedDate
    @JsonIgnore
    private Instant createdAt;
}

public abstract class IdentifiableEntity implements Persistable<UUID> {

    @Id
    private UUID id;

    @JsonIgnore
    public boolean isNew() {
        return getId() == null;
    }
}
A complete minimal example (including a test) can be found here: https://github.com/mab/auditable
With 2.1.11.RELEASE the test succeeds; with 2.2.0.RELEASE it fails.
For me the best solution was to switch from event-based UUID generation to a callback-based one. By implementing Ordered we can set the new callback to be executed after the AuditingEntityCallback.
public class IdEntityCallback implements BeforeConvertCallback<IdentifiableEntity>, Ordered {

    @Override
    public IdentifiableEntity onBeforeConvert(IdentifiableEntity entity, String collection) {
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
        return entity;
    }

    @Override
    public int getOrder() {
        return 101;
    }
}
I registered the callback with the MongoConfiguration. For a more general solution you might want to take a look at how the AuditingEntityCallback is registered by the MongoAuditingBeanDefinitionParser.
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {

    @Bean
    public IdEntityCallback registerCallback() {
        return new IdEntityCallback();
    }
}
MongoTemplate works in the following way on doInsert():
1. this.maybeEmitEvent - emits an event (onBeforeConvert, onBeforeSave and such) that any AbstractMappingEventListener can catch and act upon, like you did with GenerateUUIDListener
2. this.maybeCallBeforeConvert - calls the before-convert callbacks, such as the Mongo auditing one
as you can see in the source code of MongoTemplate (lines 831-832):
protected <T> T doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
    BeforeConvertEvent<T> event = new BeforeConvertEvent(objectToSave, collectionName);
    T toConvert = ((BeforeConvertEvent) this.maybeEmitEvent(event)).getSource(); // emit event
    toConvert = this.maybeCallBeforeConvert(toConvert, collectionName); // call some before-convert handlers
    ...
}
MongoAuditing sets createdAt only on new entities, by checking entity.isNew() == true. Because your code has already set the id (the UUID), the entity is not considered new, so createdAt is not populated.
You can do one of the following (ordered from best to worst):
1. Forget about the UUID and use String for your id; let Mongo itself create and manage its entities' ids (this is how MongoTemplate actually works, lines 811-812).
2. Keep the UUID at the code level, converting from/to String when inserting into and retrieving from the db.
3. Create a custom repository like in this post.
4. Stay with 2.1.11.RELEASE.
5. Set updatedAt in GenerateUUIDListener as well as the id (rename it NewEntityListener or something); basically implement the auditing yourself.
6. Implement new isNew() logic that doesn't depend only on the entity id (a sketch follows at the end of this answer).
In version 2.1.11.RELEASE the order of those two methods was flipped (MongoTemplate lines 804-805), so your code worked fine.
More abstractly, the nature of an event is send-and-forget (async compatible), so it is very bad practice to mutate the object itself - there is NO guarantee about the order of computation, if any.
This is why the auditing is built on callbacks rather than events, and why Pivotal doesn't (need to) keep the order stable between versions.
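For the last option, a minimal sketch (my assumption of how it could look, adapting the IdentifiableEntity from the question): keep a transient flag instead of deriving newness from the id.

public abstract class IdentifiableEntity implements Persistable<UUID> {

    @Id
    private UUID id;

    @Transient // not stored in Mongo; true until the entity has been persisted or loaded
    private boolean isNew = true;

    @Override
    public boolean isNew() {
        return isNew;
    }

    public void markNotNew() {
        this.isNew = false;
    }

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }
}

Something still has to call markNotNew() after a save or load - e.g. an AfterSaveCallback - which is exactly the kind of bookkeeping the callback-based solution above sidesteps.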

Reactive repository throws exception when saving a new object

I am using r2dbc, r2dbc-h2 and the experimental spring-boot-starter-data-r2dbc:
implementation 'org.springframework.boot.experimental:spring-boot-starter-data-r2dbc:0.1.0.M1'
implementation 'org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE' // starter-data provides old version
implementation 'io.r2dbc:r2dbc-h2:0.8.0.RELEASE'
implementation 'io.r2dbc:r2dbc-pool:0.8.0.RELEASE'
I have created a reactive repository:
public interface IJsonComparisonRepository extends ReactiveCrudRepository<JsonComparisonResult, String> {}
I also added a custom script that creates a table in H2 on startup:
@SpringBootApplication
public class JsonComparisonApplication {

    public static void main(String[] args) {
        SpringApplication.run(JsonComparisonApplication.class, args);
    }

    @Bean
    public CommandLineRunner startup(DatabaseClient client) {
        return (args) -> client
            .execute(() -> {
                var resource = new ClassPathResource("ddl/script.sql");
                try (var is = new InputStreamReader(resource.getInputStream())) {
                    return FileCopyUtils.copyToString(is);
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
            })
            .then()
            .block();
    }
}
My r2dbc configuration looks like this
@Configuration
@EnableR2dbcRepositories
public class R2dbcConfiguration extends AbstractR2dbcConfiguration {

    @Override
    public ConnectionFactory connectionFactory() {
        return new H2ConnectionFactory(
            H2ConnectionConfiguration.builder()
                .url("mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
                .username("sa")
                .build());
    }
}
My service where I perform the logic looks like this
@Override
public Mono<JsonComparisonResult> updateOrCreateRightSide(String comparisonId, String json) {
    return updateComparisonSide(comparisonId, storedComparisonResult -> {
        storedComparisonResult.setRightSide(json);
        return storedComparisonResult;
    });
}

private Mono<JsonComparisonResult> updateComparisonSide(String comparisonId,
        Function<JsonComparisonResult, JsonComparisonResult> updateSide) {
    return repository.findById(comparisonId)
        .defaultIfEmpty(createResult(comparisonId))
        .filter(result -> ComparisonDecision.NONE == result.getDecision()) // if not NONE - it means it was found and completed
        .switchIfEmpty(Mono.error(new NotUpdatableCompleteComparisonException(comparisonId)))
        .map(updateSide)
        .flatMap(repository::save);
}

private JsonComparisonResult createResult(String comparisonId) {
    LOGGER.info("Creating new comparison result: {}.", comparisonId);
    var newResult = new JsonComparisonResult();
    newResult.setDecision(ComparisonDecision.NONE);
    newResult.setComparisonId(comparisonId);
    return newResult;
}
The domain looks like this
@Table("json_comparison")
public class JsonComparisonResult {

    @Column("comparison_id")
    @Id
    private String comparisonId;

    @Column("left")
    private String leftSide;

    @Column("right")
    private String rightSide;

    // @Enumerated(EnumType.STRING) - no support for now
    @Column("decision")
    private ComparisonDecision decision;

    private String differences;

    // getters and setters omitted
}
The problem is that when I try to add any object to the database it fails with the exception
org.springframework.dao.TransientDataAccessResourceException: Failed to update table [json_comparison]. Row with Id [4] does not exist.
at org.springframework.data.r2dbc.repository.support.SimpleR2dbcRepository.lambda$save$0(SimpleR2dbcRepository.java:91) ~[spring-data-r2dbc-1.0.0.RELEASE.jar:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:96) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:73) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoUsingWhen$MonoUsingWhenSubscriber.deferredComplete(MonoUsingWhen.java:276) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxUsingWhen$CommitInner.onComplete(FluxUsingWhen.java:536) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:1858) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators.complete(Operators.java:132) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoEmpty.subscribe(MonoEmpty.java:45) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
For some reason, during save the SimpleR2dbcRepository library class doesn't consider objectToSave as new, and the UPDATE it issues instead fails because the row doesn't actually exist.
// SimpleR2dbcRepository#save
@Override
@Transactional
public <S extends T> Mono<S> save(S objectToSave) {
    Assert.notNull(objectToSave, "Object to save must not be null!");
    if (this.entity.isNew(objectToSave)) { // not new
        ....
    }
}
Why is this happening, and what is the problem?
TL;DR: How should Spring Data know if your object is new or whether it should exist?
Relational Spring Data Repositories (both, JDBC and R2DBC) must differentiate on [Reactive]CrudRepository.save(…) whether the given object is new or whether it exists in your database. Performing a save(…) operation results either in an INSERT or UPDATE statement. Issuing the wrong statement either causes a primary key violation or a no-op as standard SQL does not have a way to express an upsert.
Spring Data JDBC and R2DBC by default use the presence or absence of the @Id value; generated primary keys are a widely used mechanism. If the primary key value is provided, the entity is considered existing; if the id value is null, the entity is considered new.
Read more in the reference documentation about Entity State Detection Strategies.
You have to implement Persistable because you provide the @Id value yourself, so the library needs help figuring out whether the row is new or whether it should exist. If your entity implements Persistable, then save(…) will use the outcome of isNew() to determine whether to issue an INSERT or UPDATE.
For example:
public class Product implements Persistable<Integer> {

    @Id
    private Integer id;
    private String description;
    private Double price;

    @Transient
    private boolean newProduct;

    @Override
    @Transient
    public boolean isNew() {
        return this.newProduct || id == null;
    }

    public Product setAsNew() {
        this.newProduct = true;
        return this;
    }
}
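Hypothetical usage (assuming the usual getters and setters omitted above, and some productRepository):

Product p = new Product();
p.setId(42); // application-assigned primary key
productRepository.save(p.setAsNew()); // isNew() == true, so an INSERT is issued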
Maybe you should consider this: choose the data type of your id/primary key as INT/LONG and set it to AUTO_INCREMENT, something like below:
CREATE TABLE PRODUCT(id INT PRIMARY KEY AUTO_INCREMENT NOT NULL, modelname VARCHAR(30), year VARCHAR(4), owner VARCHAR(50));
In your POST request body, do not include the id field. Removing the @Id value made the save issue an INSERT statement.
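A minimal sketch of a matching entity (the naming is mine, mirroring the PRODUCT table above): with a numeric @Id left null, SimpleR2dbcRepository treats the object as new and issues an INSERT, and H2 fills in the generated key.

@Table("product")
public class Product {

    @Id
    private Long id; // leave null for new rows; AUTO_INCREMENT populates it on insert

    private String modelname;
    private String year;
    private String owner;

    // getters and setters omitted
}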

Equivalent of MyBatis XML multiple environments in MyBatis Guice

I'm writing a service that needs to use a different database depending on context (a simple string label). Each database has exactly the same schema. The list of databases is dynamic.
Looking through the MyBatis-Guice documentation on multiple data sources, the example assumes the list of datasources is known upfront and each datasource has a different mapper. Similarly, a question found here on SO assumes the same requirements.
As stated, my requirements are much more dynamic and fluid. The idea is to have all the currently known databases (with their connection information) in a config and have that parsed at service startup. Then, dependent upon the context of any incoming requests, the code should pull the SqlSessionFactory for the correct database. All downstream code that uses that SqlSessionFactory is exactly the same - i.e. not dependent on request context. Which means the same mappers are used no matter what database is required.
My MyBatis and Guice knowledge is admittedly very new and limited. However, I've not been able to find anything that shows the MyBatis-Guice equivalent of the multiple-environment approach supported by the XML configuration of MyBatis.
I managed to come up with a solution that works for me, so thought I'd share it here. The decision to use Guice had already been made, so there was no wriggle room there.
First, I wrote a MyBatis Guice module for registering a single datasource. It is a PrivateModule so that the MyBatis classes registered for one datasource do not conflict with the registrations for other datasources. It uses an internal MyBatisModule implementation because Java doesn't support multiple inheritance, meaning we can't write public class MyMyBatisModule extends PrivateModule, MyBatisModule {...}.
public class MyMyBatisModule extends PrivateModule {

    private final String datasourceLabel;
    private final Properties datasourceProperties;
    private List< Key<?> > exposedKeys = new ArrayList< Key<?> >();

    public MyMyBatisModule( String datasourceLabel, Properties datasourceProperties ) {
        this.datasourceLabel = datasourceLabel;
        this.datasourceProperties = datasourceProperties;
    }

    @Override
    protected void configure() {
        install( new InternalMyMyBatisModule( ) );
        for( Key<?> key : exposedKeys ) {
            expose( key );
        }
    }

    private class InternalMyMyBatisModule extends MyBatisModule {

        @Override
        protected void initialize( ) {
            environmentId( datasourceLabel );
            Names.bindProperties( binder(), datasourceProperties );
            install( JdbcHelper.MySQL ); // See JdbcHelper commentary below
            bindDataSourceProviderType( C3p0DataSourceProvider.class ); // Choose whichever one you want
            bindTransactionFactoryType( JdbcTransactionFactory.class );

            // Register your mapper classes here. These mapper classes will have their
            // keys exposed from the PrivateModule, i.e.
            //
            // exposedKeys.add( registerMapper( FredMapper.class ) );
            // exposedKeys.add( registerMapper( GingerMapper.class ) );
        }

        private <T> Key<T> registerMapper( Class<T> mapperClass ) {
            Key<T> key = Key.get( mapperClass, Names.named( datasourceLabel ) );
            bind( key ).to( mapperClass );
            addMapperClass( mapperClass );
            return key;
        }
    }
}
JdbcHelper.MySQL: I've used JdbcHelper.MySQL as a shortcut to map properties to the connection string, with com.mysql.jdbc.Driver as the JDBC driver. It's declared as:
MySQL("jdbc:mysql://${JDBC.host|localhost}:${JDBC.port|3306}/${JDBC.schema}", "com.mysql.jdbc.Driver"),
Now it's time to register all your datasources. MyBatisModules handles this for us; it requires a map from datasourceLabel to JDBC properties.
public class MyBatisModules extends AbstractModule {

    private Map< String, Properties > connectionsProperties;

    public MyBatisModules( Map< String, Properties > connectionsProperties ) {
        this.connectionsProperties = connectionsProperties; // consider a deep copy if appropriate
    }

    @Override
    protected void configure( ) {
        for( Entry< String, Properties > datasourceConnectionProperties : this.connectionsProperties.entrySet() ) {
            install( new MyMyBatisModule( datasourceConnectionProperties.getKey(), datasourceConnectionProperties.getValue() ) );
        }

        bind( MapperRetriever.class ); // See MapperRetriever later

        // Bind your DAO classes here. By wrapping MyBatis Mapper use in DAO implementations,
        // theoretically we can fairly easily change from MyBatis to any other database library
        // just by changing the DAO implementation. The rest of our codebase would remain the same.
        //
        // i.e.
        //
        // bind( FredDao.class ).to( FredDaoMyBatis.class );
        // bind( GingerDao.class ).to( GingerDaoMyBatis.class );
    }
}
Now we just need some way of getting the right Mapper class (which is itself associated with the right datasource). To do this, we actually need to call a method on the Guice Injector. I don't really like the idea of passing that around, so I wrapped it in a MapperRetriever. You need to implement a retrieval method for each of your Mappers.
public class MapperRetriever {

    private final Injector injector;

    @Inject
    public MapperRetriever( Injector injector ) {
        this.injector = injector;
    }

    // The following two methods use the example Mappers referenced in the MyMyBatisModule implementation above

    public FredMapper getFredMapper( String datasourceLabel ) {
        return this.injector.getInstance( Key.get( FredMapper.class, Names.named( datasourceLabel ) ) );
    }

    public GingerMapper getGingerMapper( String datasourceLabel ) {
        return this.injector.getInstance( Key.get( GingerMapper.class, Names.named( datasourceLabel ) ) );
    }
}
And an example DAO implementation ...

public interface FredDao {
    Fred selectFred( String datasourceLabel, String fredId );
}

public class FredDaoMyBatis implements FredDao {

    private MapperRetriever mapperRetriever;

    @Inject
    public FredDaoMyBatis( MapperRetriever mapperRetriever ) {
        this.mapperRetriever = mapperRetriever;
    }

    @Override
    public Fred selectFred( String datasourceLabel, String fredId ) {
        FredMapper fredMapper = this.mapperRetriever.getFredMapper( datasourceLabel );
        return fredMapper.getFred( fredId );
    }
}
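To tie it together, a hypothetical bootstrap (labels, hosts and credentials are made up, and it assumes the DAO bindings shown in the comments above are enabled):

Map< String, Properties > connectionsProperties = new HashMap<>();

Properties ordersDb = new Properties();
ordersDb.setProperty( "JDBC.host", "db1.example.com" );
ordersDb.setProperty( "JDBC.schema", "orders" );
ordersDb.setProperty( "JDBC.username", "app" );
ordersDb.setProperty( "JDBC.password", "secret" );
connectionsProperties.put( "ordersDb", ordersDb );
// ... one entry per database from your config ...

Injector injector = Guice.createInjector( new MyBatisModules( connectionsProperties ) );
FredDao fredDao = injector.getInstance( FredDao.class );
Fred fred = fredDao.selectFred( "ordersDb", "42" );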
You can also create a custom SqlSessionFactory provider that returns a SqlSessionFactory delegating to the correct datasource's SqlSessionFactory, using a ThreadLocal to determine the underlying SqlSessionFactory.
public class DelegatingSqlSessionFactory implements SqlSessionFactory {

    // The original answer left key lookup unspecified; a simple ThreadLocal-based sketch:
    private static final ThreadLocal<String> CURRENT_KEY = new ThreadLocal<>();

    public static void setCurrentKey(String key) {
        CURRENT_KEY.set(key);
    }

    private final Map<String, SqlSessionFactory> factories = new HashMap<>();

    public DelegatingSqlSessionFactory(Map<String, DataSource> dataSources) {
        dataSources.forEach((key, ds) -> {
            factories.put(key, createSqlSessionFactory(key, ds));
        });
    }

    // Build a plain MyBatis SqlSessionFactory for one datasource
    private SqlSessionFactory createSqlSessionFactory(String key, DataSource ds) {
        Environment environment = new Environment(key, new JdbcTransactionFactory(), ds);
        Configuration configuration = new Configuration(environment);
        // register mapper classes on 'configuration' as needed
        return new SqlSessionFactoryBuilder().build(configuration);
    }

    private SqlSessionFactory delegate() {
        // Read from the ThreadLocal to determine the correct SqlSessionFactory key
        String key = findKey();
        return factories.get(key);
    }

    private String findKey() {
        return CURRENT_KEY.get();
    }

    @Override
    public SqlSession openSession() {
        return delegate().openSession();
    }

    @Override
    public SqlSession openSession(boolean autoCommit) {
        return delegate().openSession(autoCommit);
    }

    @Override
    public SqlSession openSession(Connection connection) {
        return delegate().openSession(connection);
    }

    @Override
    public SqlSession openSession(TransactionIsolationLevel level) {
        return delegate().openSession(level);
    }

    @Override
    public SqlSession openSession(ExecutorType execType) {
        return delegate().openSession(execType);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, boolean autoCommit) {
        return delegate().openSession(execType, autoCommit);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, TransactionIsolationLevel level) {
        return delegate().openSession(execType, level);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, Connection connection) {
        return delegate().openSession(execType, connection);
    }

    @Override
    public Configuration getConfiguration() {
        return delegate().getConfiguration();
    }
}
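A hypothetical Guice wiring for it (the module name and map contents are mine): bind the facade as the application-wide SqlSessionFactory, then have request-handling code call DelegatingSqlSessionFactory.setCurrentKey(label) before touching the database.

public class DelegatingMyBatisModule extends AbstractModule {

    private final Map<String, DataSource> dataSources;

    public DelegatingMyBatisModule(Map<String, DataSource> dataSources) {
        this.dataSources = dataSources;
    }

    @Override
    protected void configure() {
        // Every injection point sees one SqlSessionFactory; the ThreadLocal picks the real one
        bind(SqlSessionFactory.class).toInstance(new DelegatingSqlSessionFactory(dataSources));
    }
}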

Manual rollback Spring MVC + Hibernate

I am using Spring MVC + Hibernate
// Class with generic methods for save and update
@Service("PersistenceTemplate")
@Transactional
public class PersistenceTemplate {

    @Resource(name = "sessionFactory")
    private SessionFactory sessionFactory;

    // SAVE
    public <T> long save(T entity) throws DataAccessException {
        Session session = sessionFactory.getCurrentSession();
        long getGenVal = (Long) session.save(entity);
        return getGenVal;
    }

    // UPDATE
    public <T> void update(T entity) throws DataAccessException {
        sessionFactory.getCurrentSession().update(entity);
    }
}
At the controller:

@Resource(name = "PersistenceTemplate")
private PersistenceTemplate pt;

long result = pt.save(receiveTrxObj1);
pt.update(receiveTrxObj2);
Problem statement:
How do I roll back the save statement if the update fails to update the entity in the database?
You could use an application-level exception to roll back your entity operations: when the custom exception is thrown, the related operations are rolled back. Please see the Spring documentation on how to define custom rollback rules.
First, your @Service("PersistenceTemplate") should be marked as @Repository, because it is doing the work of the DAO layer.
From the controller you should call a service annotated with @Service and @Transactional, and inside this service create a method which calls the DAO layer.
If save or update fails, the service-layer method it was called from will not complete and the transaction is rolled back automatically, because persistent objects are synchronized with the database near the end of the service-layer method, once control returns to it.
See the below example.
@Service("authorLoadService")
@Transactional
@Scope(proxyMode = ScopedProxyMode.TARGET_CLASS, value = "request")
public class AuthorEntityLoadService implements EntitiesLoadService {

    private AuthorDAO authorDao; // this is my DAO

    @Autowired
    @Qualifier("authorDAO")
    public void setAuthorDao(AuthorDAO authorDao) {
        this.authorDao = authorDao;
    }

    @Override
    public void deleteEntities(Object o) {
        // TODO Auto-generated method stub
    }

    @Override
    public void loadEntities(Object o) {
        Set<author_pojo> author = (Set<author_pojo>) o;
        Iterator<author_pojo> itr = author.iterator();
        while (itr.hasNext()) {
            author_pojo authorPojo = (author_pojo) itr.next();
            authorDao.save(authorPojo);
        }
    }

    @Override
    @Transactional(readOnly = true)
    public List getEntities() {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    @Transactional(readOnly = true)
    public Object getEntity(Object o) {
        String author = (String) o;
        author_pojo fetAuthor = authorDao.findOneByName(author);
        return fetAuthor;
    }
}
My Abstract Generic DAO
public abstract class AbstractHibernateDAO<T extends Serializable> {

    public Class<T> clazz; // class object reference

    protected SessionFactory mysessionFactory;

    @Autowired
    public void setMysessionFactory(SessionFactory mysessionFactory) {
        this.mysessionFactory = mysessionFactory;
    }

    // Note: ignores the name parameter; the concrete DAO below overrides this to filter by name
    public T findOneByName(final String name) {
        return (T) getCurrentSession().createQuery("from " + clazz.getName()).uniqueResult();
    }

    public void setClazz(final Class<T> clazzToSet) {
        this.clazz = clazzToSet;
    }

    public T findOne(final Long id) {
        return (T) getCurrentSession().get(clazz, id);
    }

    @SuppressWarnings("unchecked")
    public List<T> findAll() {
        return getCurrentSession().createQuery("from " + clazz.getName()).list();
    }

    public void save(final T entity) {
        getCurrentSession().merge(entity);
    }

    public void update(final T entity) {
        getCurrentSession().update(entity);
    }

    public void delete(final T entity) {
        getCurrentSession().delete(entity);
    }

    public void deleteById(final Long entityId) {
        final T entity = findOne(entityId);
        delete(entity);
    }

    protected Session getCurrentSession() {
        return mysessionFactory.getCurrentSession();
    }
}
My concrete DAO:
@Repository("authorDAO")
@Scope(proxyMode = ScopedProxyMode.TARGET_CLASS, value = "request")
public class AuthorDAO extends AbstractHibernateDAO<author_pojo> {

    public AuthorDAO() {
        setClazz(author_pojo.class);
    }

    public author_pojo findOneByName(final String name) {
        System.out.println(clazz);
        return (author_pojo) getCurrentSession()
            .createQuery("from " + clazz.getName() + " where authorName=:name")
            .setParameter("name", name)
            .uniqueResult();
    }
}
For you to be able to roll back the save if the update fails, the save and update have to occur within the same transaction. Services are a natural place to put DAO calls that need to execute within the same transaction.
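For instance, a minimal sketch (the service name and method are mine, reusing the asker's PersistenceTemplate): because the template's @Transactional methods default to REQUIRED propagation, they join the service's transaction, so an exception from update() rolls back the earlier save().

@Service
public class ReceiveTrxService {

    @Resource(name = "PersistenceTemplate")
    private PersistenceTemplate pt;

    @Transactional
    public long saveAndUpdate(Object trxObj1, Object trxObj2) {
        long result = pt.save(trxObj1);
        pt.update(trxObj2); // a RuntimeException here rolls back the save above
        return result;
    }
}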
Putting a @Transactional annotation on the controller method would create complications due to proxying the controller; see the Spring MVC documentation, section 17.3.2:
A common pitfall when working with annotated controller classes happens when applying functionality that requires creating a proxy for the controller object (e.g. @Transactional methods). Usually you will introduce an interface for the controller in order to use JDK dynamic proxies. To make this work you must move the @RequestMapping annotations, as well as any other type and method-level annotations (e.g. @ModelAttribute, @InitBinder), to the interface as well, as the mapping mechanism can only "see" the interface exposed by the proxy. Alternatively, you could activate proxy-target-class="true" in the configuration for the functionality applied to the controller (in our transaction scenario in <tx:annotation-driven/>). Doing so indicates that CGLIB-based subclass proxies should be used instead of interface-based JDK proxies. For more information on various proxying mechanisms see Section 9.6, "Proxying mechanisms".
See this question for what goes in a service as opposed to in a controller.

How can I validate a field as required depending on another field's value in SEAM?

I'm trying to create a simple custom validator for my project, and I can't seem to find a way of getting Seam to validate things conditionally.
Here's what I've got:
A helper/backing bean (that is NOT an entity)
@RequiredIfSelected
public class AdSiteHelper {

    private Date start;
    private Date end;
    private boolean selected;

    /* getters and setters implied */
}
What I need is for "start" and "end" to be required if and only if selected is true.
I tried creating a custom validator at the TYPE target, but seam doesn't seem to want to pick it up and validate it. (Maybe because it's not an entity?)
Here's the general idea of my custom annotation, for starters:
@ValidatorClass(RequiredIfSelectedValidator.class)
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
public @interface RequiredIfSelected {
    String message();
}

public class RequiredIfSelectedValidator implements Validator<RequiredIfSelected>, Serializable {

    public boolean isValid(Object value) {
        AdSiteHelper ash = (AdSiteHelper) value;
        return !ash.isSelected() || (ash.getStart() != null && ash.getEnd() != null);
    }

    public void initialize(RequiredIfSelected parameters) { }
}
I had a similar problem, covered by this post. If the bean holding these values is always the same, you could just load the current instance of it into your validator with:

// Assuming you have the @Name annotation populated on your bean and a scope of CONVERSATION or higher
AdSiteHelper helper = (AdSiteHelper) Component.getInstance("adSiteHelper");
Also, as you're using Seam, your validators don't need to be so complex. You don't need an interface, and it can be as simple as:

@Name("requiredIfSelectedValidator")
@Validator
public class RequiredIfSelectedValidator implements javax.faces.validator.Validator {

    public void validate(FacesContext context, UIComponent component, Object value) throws ValidatorException {
        // do stuff; for example, resolve the helper as shown above and reject invalid state:
        // AdSiteHelper ash = (AdSiteHelper) Component.getInstance("adSiteHelper");
        // if (ash.isSelected() && (ash.getStart() == null || ash.getEnd() == null)) {
        //     throw new ValidatorException(new FacesMessage("Start and end dates are required"));
        // }
    }
}
