I'm using Hibernate and QueryDSL along with PostgreSQL in a Spring application, and I'm facing some performance issues with my filtered lists. Using the StringPath class, I'm calling either startsWithIgnoreCase, endsWithIgnoreCase or containsIgnoreCase.
It appears the generated query has the following WHERE clause:
WHERE lower(person.firstname) LIKE ? ESCAPE '!'
Because of the lower(), the query is not taking advantage of the Postgres indexes. On a dev database, queries take up to 1 second instead of 10 ms with the ILIKE keyword.
Is there a way to get a Predicate using Postgres' ILIKE, as Ops doesn't seem to provide it?
Thanks
I've got exactly the same issue: lower(column) throws off the Postgres statistics so the query is planned inefficiently, and ilike solves the problem. I couldn't tell which parts of the OP's answer were relevant to the solution, so I reinvented the same approach, just a bit shorter.
Introduce a new dialect with a my_ilike function and its implementation:
public class ExtendedPostgresDialect extends org.hibernate.dialect.PostgreSQL9Dialect {
    public ExtendedPostgresDialect() {
        super();
        registerFunction("my_ilike", new SQLFunctionTemplate(BooleanType.INSTANCE, "(?1 ilike ?2)"));
    }
}
Specify this dialect to be used by Hibernate (I use Java config):
Properties props = new Properties();
props.setProperty("hibernate.dialect", "com.example.ExtendedPostgresDialect");
factory.setJpaProperties(props);
That's it, now you can use it:
BooleanTemplate.create("function('my_ilike', {0}, {%1%})", stringPath, value).isTrue();
Had to update this:
We found a way to create the needed Postgres operators by registering a SQL function using ilike in our custom Hibernate dialect.
Example with ilike:
// Postgres operator constants
public class PostgresOperators {
    private static final String NS = PostgresOperators.class.getName();
    public static final Operator<Boolean> ILIKE = new OperatorImpl<>(NS, "ILIKE");
}

// Custom JPQLTemplates
public class PostgresTemplates extends HQLTemplates {
    public static final PostgresTemplates DEFAULT = new PostgresTemplates();

    public PostgresTemplates() {
        super();
        add(PostgresOperators.ILIKE, "my_ilike({0},{1})");
    }
}
Specify the JPQLTemplates when creating the JPAQuery:
new JPAQuery(entityManager, PostgresTemplates.DEFAULT);
Now it gets tricky: we couldn't use ilike directly because an "ilike" keyword is already registered, so we made an ilike function and registered it in a custom Hibernate dialect wired up through Spring.
Our application.yml specifies the dialect:
# See the JPA section of http://docs.spring.io/spring-boot/docs/current/reference/html/common-application-properties.html
spring.jpa.properties.hibernate.dialect: com.example.customDialect.config.database.ExtendedPostgresDialect
Then
public class ExtendedPostgresDialect extends org.hibernate.dialect.PostgreSQL82Dialect {
    public ExtendedPostgresDialect() {
        super();
        registerFunction("my_ilike", new PostgreSQLIlikeFunction());
    }
}
We tried registerKeyword("ilike"), but that didn't work, so we stayed with our function and the following implementation.
public class PostgreSQLIlikeFunction implements SQLFunction {

    @Override
    public Type getReturnType(Type columnType, Mapping mapping)
            throws QueryException {
        return new BooleanType();
    }

    @SuppressWarnings("unchecked")
    @Override
    public String render(Type firstArgumentType, List args, SessionFactoryImplementor factory) throws QueryException {
        if (args.size() != 2) {
            throw new IllegalArgumentException(
                    "The function must be passed 2 arguments");
        }
        String str1 = (String) args.get(0);
        String str2 = (String) args.get(1);
        return str1 + " ilike " + str2;
    }

    @Override
    public boolean hasArguments() {
        return true;
    }

    @Override
    public boolean hasParenthesesIfNoArguments() {
        return false;
    }
}
That's pretty much it; now we can use ILIKE the following way:
BooleanOperation.create(PostgresOperators.ILIKE, expression1, expression2).isTrue()
I'm new to using JPA and trying to transition my code from JdbcTemplate to JPA. Originally I updated a subset of my columns by taking in a map of the columns with their values, building the SQL UPDATE string myself, and executing it using a DAO. I was wondering what would be the best way to do something similar using JPA?
EDIT:
How would I transform this code from my DAO to something equivalent in JPA?
public void updateFields(String userId, Map<String, String> fields) {
    StringBuilder sb = new StringBuilder();
    for (Entry<String, String> entry : fields.entrySet()) {
        sb.append(entry.getKey());
        sb.append("='");
        sb.append(StringEscapeUtils.escapeEcmaScript(entry.getValue()));
        sb.append("', ");
    }
    String str = sb.toString();
    if (str.length() > 2) {
        str = str.substring(0, str.length() - 2); // remove ", "
        String sql = "UPDATE users_table SET " + str + " WHERE user_id=?";
        jdbcTemplate.update(sql, new Object[] { userId },
                new int[] { Types.VARCHAR });
    }
}
You have to read more about JPA for sure :)
Once an entity is in the persistence context, it is tracked by the JPA provider until the persistence context ends or until the EntityManager#detach() method is called. When the transaction finishes (commits), the state of the managed entities in the persistence context is synchronized with the database and all changes are made.
If your entity is new, you can simply put it in the persistence context by invoking the EntityManager#persist() method.
In your case (update of an existing entity), you have to get a row from the database and somehow turn it into an entity. It can be done in many ways, but the simplest is to call the EntityManager#find() method, which returns a managed entity. The returned object is also put into the current persistence context, so if there is an active transaction you can change whatever property you like (except the primary key) and just finish the transaction by invoking commit (or, if this is a container-managed transaction, by just letting the method finish).
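For example, a minimal sketch of that flow (assuming a User entity with an ordinary setter and an application-managed EntityManager; with container-managed transactions the begin/commit calls go away):

EntityTransaction tx = entityManager.getTransaction();
tx.begin();
User user = entityManager.find(User.class, userId); // returns a managed entity
user.setName("new name");                           // change is tracked by the persistence context
tx.commit();                                        // changes are flushed to the database here, no merge() needed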
update
After your comment I can see your point. I think you should redesign your app to fit JPA standards and capabilities. Anyway, if you already have a map of <attribute_name, attribute_value> pairs, you can make use of something called the Metamodel. Simple usage is shown below. This is a naive implementation and works well only with basic attributes; you should take care of relationships etc. (more information about the attributes is available via the methods attr.getJavaType() or attr.getPersistentAttributeType()).
Metamodel meta = entityManager.getMetamodel();
EntityType<User> user_ = meta.entity(User.class);
CriteriaBuilder cb = entityManager.getCriteriaBuilder();
CriteriaUpdate<User> update = cb.createCriteriaUpdate(User.class);
Root<User> e = update.from(User.class);
for (Attribute<? super User, ?> attr : user_.getAttributes()) {
    if (map.containsKey(attr.getName())) {
        // use the string-based overload so any basic attribute works
        update.set(attr.getName(), map.get(attr.getName()));
    }
}
update.where(cb.equal(e.get("id"), idOfUser));
entityManager.createQuery(update).executeUpdate();
Please note that criteria update queries have been available in JPA since version 2.1.
Here you can find more information about metamodel generation.
As an alternative to the metamodel, you can just use Java reflection mechanisms.
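A rough sketch of that reflection variant (hypothetical helper; it assumes the map keys match the entity's field names exactly and that the entity is already managed inside an active transaction):

// naive reflection-based update of a managed entity
void applyFields(Object managedEntity, Map<String, String> fields) throws Exception {
    for (Map.Entry<String, String> entry : fields.entrySet()) {
        java.lang.reflect.Field field =
                managedEntity.getClass().getDeclaredField(entry.getKey());
        field.setAccessible(true);
        field.set(managedEntity, entry.getValue());
    }
    // no explicit persist/merge needed: changes to a managed entity
    // are flushed when the transaction commits
}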
JPA handles the update. Retrieve the row as an entity using the EntityManager, change the value, and persist it. This will store the changed data in your DB.
In case you are using Hibernate (as the JPA provider), here's an example.
Entity
@Entity
@Table(name="PERSON")
public class Person {

    @Id @GeneratedValue(strategy=GenerationType.IDENTITY)
    private int id;

    @Column(name="NAME", nullable=false)
    private String name;

    // other fields...
}
DAO
public interface PersonDao {
Person findById(int id);
void persist(Person person);
...
}
DaoImpl
@Repository("personDao")
public class PersonDaoImpl extends AnAbstractClassWithSessionFactory implements PersonDao {

    public Person findById(int id) {
        return (Person) getSession().get(Person.class, id);
    }

    public void persist(Person person) {
        getSession().persist(person);
    }
}
Service
@Service("personService")
@Transactional
public class PersonServiceImpl implements PersonService {

    @Autowired
    PersonDao personDao;

    @Override
    public void createAndPersist(SomeSourceObject object) {
        // create a Person object and populate it from the source object
        Person person = new Person();
        person.setName(object.name);
        ...
        personDao.persist(person);
    }

    @Override
    public Person findById(int id) {
        return personDao.findById(id);
    }

    public void doSomethingWithPerson(Person person) {
        person.setName(person.getName() + " HELLO ");
        // here, since we are in a transaction, there is no need to explicitly call update/merge;
        // the change will be written to the DB as soon as the method completes successfully,
        // OR it will be undone if the transaction fails/is rolled back
    }
}
The JPA documentation is indeed a good resource for details.
From a design point of view, if you have a web interface, I tend to suggest adding one more service delegate layer (e.g. PersonDelegateService) which maps the actual data received from the UI to the Person entity (and, vice versa, populates the view object from the Person entity for display) and delegates to the service for the actual Person entity processing.
I am new to Java and I'm trying to implement a basic database access layer.
I'm using Apache DBUtils to reduce JDBC boilerplate code and this is working really well.
The problem is that my implementation uses a separate class for CRUD for each table in my database and it feels wrong to be duplicating so much functionality.
Is this an acceptable design and if not what can I do to reduce code duplication?
Could I refactor my solution to use generics in some fashion?
I realize I could use an ORM (myBatis, Hibernate etc) as a solution but I would like to try to stick with DBUtils and plain JDBC if I can help it.
Just for clarification:
Let's say I have 2 tables...
---------------------
User | File
---------------------
userId | fileId
name | path
age | size
---------------------
In my current solution I would create 2 classes (UserStore, FileStore) and
each class would implement similar basic CRUD methods:
protected boolean Create(User newUser)
{
    QueryRunner run = new QueryRunner(dataSource);
    try
    {
        run.update("INSERT INTO User (name, age) " +
                   "VALUES (?, ?)", newUser.getName(), newUser.getAge());
    }
    catch (SQLException ex)
    {
        Log.logException(ex);
        return false;
    }
    return true;
}

protected User Read(int userId)
{
    try
    {
        User user = run.query("SELECT * FROM User WHERE userId = ? ", userId);
        return user;
    }
    catch (SQLException ex)
    {
        Log.logException(ex);
        return null;
    }
}

protected boolean update(User user)
{
    ... perform database query etc
}

protected boolean delete(int userId)
{
    ... perform database query etc
}
You asked how I would do this with the Template Method pattern. Here is an example of how you could do it:
public abstract class AbstractDAO<T> {

    private String table;
    private String id_field;

    public AbstractDAO(String table, String id_field) {
        this.table = table;
        ...
    }

    public T read(int id) {
        try
        {
            T user = run.query("SELECT * FROM " + table + " WHERE " + id_field + " = ? ", id);
            return user;
        }
        catch (SQLException ex)
        {
            Log.logException(ex);
            return null;
        }
    }
This one looks easy, how about Create?
    public boolean Create(T user) {
        QueryRunner run = new QueryRunner(dataSource);
        try
        {
            run.update("INSERT INTO " + table + getFields() +
                       "VALUES " + getParameters(user));
        }
        catch (SQLException ex)
        {
            Log.logException(ex);
            return false;
        }
        return true;
    }
    protected abstract String getFields();

    protected abstract String getParameters(T user);
}
Ugly, and insecure, but okay for transmitting the idea.
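For completeness, a hypothetical UserDAO subclass (not in the answer above) might fill in the two abstract methods like this, reusing the User bean from the question:

public class UserDAO extends AbstractDAO<User> {

    public UserDAO() {
        super("User", "userId");
    }

    @Override
    protected String getFields() {
        // column list, with surrounding spaces so the concatenation in Create() lines up
        return " (name, age) ";
    }

    @Override
    protected String getParameters(User user) {
        // string concatenation only mirrors the sketch above;
        // real code should use parameterized queries to avoid SQL injection
        return "('" + user.getName() + "', " + user.getAge() + ")";
    }
}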
It looks like I can give you a few simple suggestions for your question.
1)
Instead of managing queries inside DAOs like you are doing, make a factory class that holds the queries you need.
like
class QueryFactory {
    static String INSERT_BOOK = "BLAH";
    static String DELETE_BOOK = "BLAH";
}
This will separate queries from DAO code and make it easier to manage.
2)
Implement a generic DAO
http://www.codeproject.com/Articles/251166/The-Generic-DAO-pattern-in-Java-with-Spring-3-and
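Not the code from the linked article, just a rough outline of the shape such a generic DAO takes:

public interface GenericDao<T, ID> {
    boolean create(T entity);
    T read(ID id);
    boolean update(T entity);
    boolean delete(ID id);
}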
3) As you have mentioned above, use an ORM to help yourself with binding beans to the database, and for many more features.
Using DBUtils you have already abstracted away a lot of boilerplate code. What remains is mostly the work that differs between entities: the right SQL statement, transformation of entity objects into UPDATE parameters and vice versa with SELECTs, and exception handling.
Unfortunately it is not easy to create a general abstraction that is flexible enough for these remaining tasks. That's what ORM mappers are all about. I would still suggest looking into one of these. If you stick to the JPA API, you are still in standards land and able to switch the ORM provider more easily (although there is always some coupling).
I was impressed by Spring Data's repository abstraction. In simple use cases it gives you zero-code DAOs. If you are already using Spring and just want to persist your object model, you should definitely look into it.
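For illustration, a minimal sketch of such a zero-code DAO (assuming the User class from the question is mapped as a JPA entity with an Integer id and Spring Data JPA is on the classpath); the interface is all you write:

import java.util.List;
import org.springframework.data.repository.CrudRepository;

// Spring Data generates the implementation at runtime;
// findByName is a derived query, no SQL or implementation class needed
public interface UserRepository extends CrudRepository<User, Integer> {
    List<User> findByName(String name);
}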
Alternatively, I have had good experiences with jOOQ. It can also generate DTOs and corresponding DAOs based on the tables in your schema. In contrast to ORM mappers it stays closer to the relational schema, which may be an advantage or a disadvantage.
MyBatis allows you to return the result type as a hashmap:
<select id="mCount" resultType="hashmap">
    select managerName, count(reportees) AS count
    from mgr_employee
    group by managerName;
</select>
So you can effectively follow a workflow like this:
1) Develop an interface
2a) Use a MyBatis annotation to define the required query, OR
2b) Link the interface to an XML file and write the query there
Note that this does not involve any DAO or other boilerplate as above; a sketch of the annotation variant follows.
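A minimal sketch of option 2a (hypothetical mapper name; the query reuses the mgr_employee example above):

import java.util.List;
import java.util.Map;
import org.apache.ibatis.annotations.Select;

// MyBatis generates the implementation; each result row becomes a Map
public interface ManagerMapper {
    @Select("select managerName, count(reportees) as count "
            + "from mgr_employee group by managerName")
    List<Map<String, Object>> mCount();
}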
Which ORM supports a domain model of immutable types?
I would like to write classes like the following (or the Scala equivalent):
class A {
    private final C c; // not mutable

    A(B b) {
        // init c
    }

    A doSomething(B b) {
        // build a new A
    }
}
The ORM has to initialize the object with the constructor, so it is possible to check invariants in the constructor. A default constructor with field/setter access for initialization is not sufficient and complicates the class's implementation.
Working with collections should be supported. If a collection is changed, it should create a copy from the user's perspective (rendering the old collection state stale, but user code can still work with, or at least read, it), much like persistent data structures work.
Some words about the motivation. Suppose you have an FP-style domain object model and you want to persist it to a database. How do you do that? You want to do as much as you can in a pure functional style until the evil side effects come in. If your domain object model is not immutable, you cannot, for example, share the objects between threads; you have to copy, cache, or use locks. So unless your ORM supports immutable types, you are constrained in your choice of solution.
UPDATE: I created a project focused on solving this problem called JIRM:
https://github.com/agentgt/jirm
I just found this question after implementing my own solution using Spring JDBC and Jackson's ObjectMapper. Basically I just needed some bare-minimum SQL <-> immutable object mapping.
In short, I just use Spring's RowMapper and Jackson's ObjectMapper to map objects back and forth from the database. I use JPA annotations just for metadata (like the column name etc.). If people are interested I will clean it up and put it on GitHub (right now it's only in my startup's private repo).
Here is a rough idea of how it works. Here is an example bean (notice how all the fields are final):
// skip imports for brevity
public class TestBean {

    @Id
    private final String stringProp;
    private final long longProp;
    @Column(name="timets")
    private final Calendar timeTS;

    @JsonCreator
    public TestBean(
            @JsonProperty("stringProp") String stringProp,
            @JsonProperty("longProp") long longProp,
            @JsonProperty("timeTS") Calendar timeTS) {
        super();
        this.stringProp = stringProp;
        this.longProp = longProp;
        this.timeTS = timeTS;
    }

    public String getStringProp() {
        return stringProp;
    }

    public long getLongProp() {
        return longProp;
    }

    public Calendar getTimeTS() {
        return timeTS;
    }
}
Here is what the RowMapper looks like (notice it mainly delegates to Spring's ColumnMapRowMapper and then uses Jackson's ObjectMapper):
public class SqlObjectRowMapper<T> implements RowMapper<T> {

    private final SqlObjectDefinition<T> definition;
    private final ColumnMapRowMapper mapRowMapper;
    private final ObjectMapper objectMapper;

    public SqlObjectRowMapper(SqlObjectDefinition<T> definition, ObjectMapper objectMapper) {
        super();
        this.definition = definition;
        this.mapRowMapper = new SqlObjectMapRowMapper(definition);
        this.objectMapper = objectMapper;
    }

    public SqlObjectRowMapper(Class<T> k) {
        this(SqlObjectDefinition.fromClass(k), new ObjectMapper());
    }

    @Override
    public T mapRow(ResultSet rs, int rowNum) throws SQLException {
        Map<String, Object> m = mapRowMapper.mapRow(rs, rowNum);
        return objectMapper.convertValue(m, definition.getObjectType());
    }
}
Now I just took Spring's JdbcTemplate and gave it a fluent wrapper. Here are some examples:
@Before
public void setUp() throws Exception {
    dao = new SqlObjectDao<TestBean>(new JdbcTemplate(ds), TestBean.class);
}

@Test
public void testAll() throws Exception {
    TestBean t = new TestBean(IdUtils.generateRandomUUIDString(), 2L, Calendar.getInstance());
    dao.insert(t);
    List<TestBean> list = dao.queryForListByFilter("stringProp", "hello");
    List<TestBean> otherList = dao.select().where("stringProp", "hello").forList();
    assertEquals(list, otherList);
    long count = dao.select().forCount();
    assertTrue(count > 0);

    TestBean newT = new TestBean(t.getStringProp(), 50, Calendar.getInstance());
    dao.update(newT);
    TestBean reloaded = dao.reload(newT);
    assertTrue(reloaded != newT);
    assertTrue(reloaded.getStringProp().equals(newT.getStringProp()));
    assertNotNull(list);
}

@Test
public void testAdding() throws Exception {
    // This will do: UPDATE test_bean SET longProp = longProp + 100
    int i = dao.update().add("longProp", 100).update();
    assertTrue(i > 0);
}

@Test
public void testRowMapper() throws Exception {
    List<Crap> craps = dao.query("select string_prop as name from test_bean limit ?", Crap.class, 2);
    System.out.println(craps.get(0).getName());

    craps = dao.query("select string_prop as name from test_bean limit ?")
            .with(2)
            .forList(Crap.class);

    Crap c = dao.query("select string_prop as name from test_bean limit ?")
            .with(1)
            .forObject(Crap.class);

    Optional<Crap> absent
            = dao.query("select string_prop as name from test_bean where string_prop = ? limit ?")
            .with("never")
            .with(1)
            .forOptional(Crap.class);

    assertTrue(! absent.isPresent());
}

public static class Crap {

    private final String name;

    @JsonCreator
    public Crap(@JsonProperty("name") String name) {
        super();
        this.name = name;
    }

    public String getName() {
        return name;
    }
}
Notice in the above how easy it is to map any query onto immutable POJOs. That is, you don't need a 1-to-1 mapping of entity to table. Also notice the use of Guava's Optional (last query, scroll down). I really hate how ORMs either throw exceptions or return null.
Let me know if you like it and I'll spend the time putting it on GitHub (only tested with PostgreSQL). Otherwise, with the info above you can easily implement your own using Spring JDBC. I'm starting to really dig it because immutable objects are easier to understand and think about.
Hibernate has the @Immutable annotation.
And here is a guide.
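For illustration, a minimal sketch of what that looks like (a hypothetical Country entity, not taken from the linked guide; note that @Immutable only stops Hibernate from issuing UPDATEs, it does not give you constructor-based initialization, so Hibernate still needs a no-arg constructor):

import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Immutable;

@Entity
@Immutable
public class Country {

    @Id
    private Long id;

    private String name;

    protected Country() { } // required by Hibernate

    public Country(Long id, String name) {
        this.id = id;
        this.name = name;
    }

    public Long getId() { return id; }
    public String getName() { return name; }
}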
Though not a real ORM, MyBatis may be able to do this. I didn't try it, though.
http://mybatis.org/java.html
AFAIK, there are no ORMs for .NET supporting this feature exactly as you wish. But you can take a look at BLToolkit and LINQ to SQL; both provide update-by-comparison semantics and always return new objects on materialization. That's nearly what you need, but I'm not sure about collections there.
Btw, why do you need this feature? I'm aware of pure functional languages and the benefits of purely immutable objects (e.g. complete thread safety). But in the case of an ORM, all the things you do with such objects are ultimately transformed into a sequence of SQL commands anyway. So I suspect the benefits of using such objects are tenuous here.
You can do this with Ebean and OpenJPA (and I think you can do this with Hibernate, but I'm not sure). The ORM (Ebean/OpenJPA) will generate a default constructor (assuming the bean doesn't have one) and actually set the values of the 'final' fields. This sounds a bit odd, but final fields are not always strictly final, per se.
SORM is a new Scala ORM which does exactly what you want. The code below will explain it better than any words:
// Declare a model:
case class Artist ( name : String, genres : Set[Genre] )
case class Genre ( name : String )
// Initialize SORM, automatically generating the schema:
import sorm._
object Db extends Instance (
  entities = Set() + Entity[Artist]() + Entity[Genre](),
  url = "jdbc:h2:mem:test"
)
// Store values in the db:
val metal = Db.save( Genre("Metal") )
val rock = Db.save( Genre("Rock") )
Db.save( Artist("Metallica", Set() + metal + rock) )
Db.save( Artist("Dire Straits", Set() + rock) )
// Retrieve values from the db:
val metallica = Db.query[Artist].whereEqual("name", "Metallica").fetchOne() // Option[Artist]
val rockArtists = Db.query[Artist].whereEqual("genres.name", "Rock").fetch() // Stream[Artist]
(note: I'm quite familiar with Java, but not with Hibernate or JPA - yet :) )
I want to write an application which talks to a DB2/400 database through JPA, and I now have it working so that I can get all entries in the table and list them to System.out (I used MyEclipse to reverse engineer). I understand that the @Table annotation results in the name being statically compiled into the class, but I need to be able to work with a table whose name and schema are provided at runtime (their definitions are the same, but we have many of them).
Apparently this is not SO easy to do, and I'd appreciate a hint.
I have currently chosen Hibernate as the JPA provider, as it can handle that these database tables are not journalled.
So, the question is, how can I at runtime tell the Hibernate implementation of JPA that class A corresponds to database table B?
(edit: an overridden tableName() in the Hibernate NamingStrategy may allow me to work around this intrinsic limitation, but I still would prefer a vendor agnostic JPA solution)
You need to use the XML version of the configuration rather than the annotations. That way you can dynamically generate the XML at runtime.
Or maybe something like Dynamic JPA would interest you?
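For example, a generated mapping file could look roughly like this (a sketch with hypothetical names; persistence.xml would reference it via a <mapping-file> element, and the file itself could be written out at runtime before the EntityManagerFactory is created):

<entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm" version="1.0">
    <entity class="com.example.Account" access="FIELD">
        <!-- table and schema chosen at runtime -->
        <table name="ACCOUNT_2013" schema="SOME_SCHEMA"/>
    </entity>
</entity-mappings>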
I think it's necessary to further clarify the issues with this problem.
The first question is: is the set of tables where an entity can be stored known? By this I mean you aren't dynamically creating tables at runtime and wanting to associate entities with them. This scenario calls for, say, three tables to be known at compile time. If that is the case, you can possibly use JPA inheritance. The OpenJPA documentation details the table-per-class inheritance strategy.
The advantage of this method is that it is pure JPA. It comes with limitations, however: the tables have to be known, and you can't easily change which table a given object is stored in (if that's a requirement for you), just as objects in OO systems don't generally change class or type.
If you want this to be truly dynamic and to move entities between tables (essentially) then I'm not sure JPA is the right tool for you. An awful lot of magic goes into making JPA work including load-time weaving (instrumentation) and usually one or more levels of caching. What's more the entity manager needs to record changes and handle updates of managed objects. There is no easy facility that I know of to instruct the entity manager that a given entity should be stored in one table or another.
Such a move operation would implicitly require a delete from one table and insertion into another. If there are child entities this gets more difficult. Not impossible mind you but it's such an unusual corner case I'm not sure anyone would ever bother.
A lower-level SQL/JDBC framework such as iBATIS may be a better bet, as it will give you the control that you want.
I've also given thought to dynamically changing or assigning annotations at runtime. I'm not yet sure if that's even possible, and even if it is, I'm not sure it would necessarily help. I can't imagine an entity manager or the caching not getting hopelessly confused by that kind of thing.
The other possibility I thought of was dynamically creating subclasses at runtime (as anonymous subclasses) but that still has the annotation problem and again I'm not sure how you add that to an existing persistence unit.
It might help if you provided some more detail on what you're doing and why. Whatever it is though, I'm leaning towards thinking you need to rethink what you're doing or how you're doing it or you need to pick a different persistence technology.
You may be able to specify the table name at load time via a custom ClassLoader that rewrites the @Table annotation on classes as they are loaded. At the moment, I am not 100% sure how you would ensure Hibernate is loading its classes via this ClassLoader.
Classes are re-written using the ASM bytecode framework.
Warning: These classes are experimental.
public class TableClassLoader extends ClassLoader {

    private final Map<String, String> tablesByClassName;

    public TableClassLoader(Map<String, String> tablesByClassName) {
        super();
        this.tablesByClassName = tablesByClassName;
    }

    public TableClassLoader(Map<String, String> tablesByClassName, ClassLoader parent) {
        super(parent);
        this.tablesByClassName = tablesByClassName;
    }

    @Override
    public Class<?> loadClass(String name) throws ClassNotFoundException {
        if (tablesByClassName.containsKey(name)) {
            String table = tablesByClassName.get(name);
            return loadCustomizedClass(name, table);
        } else {
            return super.loadClass(name);
        }
    }

    public Class<?> loadCustomizedClass(String className, String table) throws ClassNotFoundException {
        try {
            String resourceName = getResourceName(className);
            InputStream inputStream = super.getResourceAsStream(resourceName);
            ClassReader classReader = new ClassReader(inputStream);
            ClassWriter classWriter = new ClassWriter(0);
            classReader.accept(new TableClassVisitor(classWriter, table), 0);
            byte[] classByteArray = classWriter.toByteArray();
            return super.defineClass(className, classByteArray, 0, classByteArray.length);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    private String getResourceName(String className) {
        Type type = Type.getObjectType(className);
        String internalName = type.getInternalName();
        return internalName.replaceAll("\\.", "/") + ".class";
    }
}
The TableClassLoader relies on the TableClassVisitor to catch the visitAnnotation method calls:
public class TableClassVisitor extends ClassAdapter {

    private static final String tableDesc = Type.getDescriptor(Table.class);
    private final String table;

    public TableClassVisitor(ClassVisitor visitor, String table) {
        super(visitor);
        this.table = table;
    }

    @Override
    public AnnotationVisitor visitAnnotation(String desc, boolean visible) {
        AnnotationVisitor annotationVisitor;
        if (desc.equals(tableDesc)) {
            annotationVisitor = new TableAnnotationVisitor(super.visitAnnotation(desc, visible), table);
        } else {
            annotationVisitor = super.visitAnnotation(desc, visible);
        }
        return annotationVisitor;
    }
}
The TableAnnotationVisitor is ultimately responsible for changing the name field of the @Table annotation:
public class TableAnnotationVisitor extends AnnotationAdapter {

    public final String table;

    public TableAnnotationVisitor(AnnotationVisitor visitor, String table) {
        super(visitor);
        this.table = table;
    }

    @Override
    public void visit(String name, Object value) {
        if (name.equals("name")) {
            super.visit(name, table);
        } else {
            super.visit(name, value);
        }
    }
}
Because I didn't happen to find an AnnotationAdapter class in ASM's library, here is one I made myself:
public class AnnotationAdapter implements AnnotationVisitor {

    private final AnnotationVisitor visitor;

    public AnnotationAdapter(AnnotationVisitor visitor) {
        this.visitor = visitor;
    }

    @Override
    public void visit(String name, Object value) {
        visitor.visit(name, value);
    }

    @Override
    public AnnotationVisitor visitAnnotation(String name, String desc) {
        return visitor.visitAnnotation(name, desc);
    }

    @Override
    public AnnotationVisitor visitArray(String name) {
        return visitor.visitArray(name);
    }

    @Override
    public void visitEnd() {
        visitor.visitEnd();
    }

    @Override
    public void visitEnum(String name, String desc, String value) {
        visitor.visitEnum(name, desc, value);
    }
}
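Usage might then look roughly like this (hypothetical class and table names; as noted above, getting Hibernate itself to load its entities through this ClassLoader is still the open question):

// map fully-qualified entity class names to the table names they should use
Map<String, String> tables = new HashMap<String, String>();
tables.put("com.example.Invoice", "INVOICE_2013");

ClassLoader loader = new TableClassLoader(tables, Thread.currentThread().getContextClassLoader());
Class<?> rewritten = loader.loadClass("com.example.Invoice");
// the rewritten class now carries @Table(name = "INVOICE_2013")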
It sounds to me like what you're after is overriding the JPA annotations with an orm.xml.
This will allow you to specify the annotations but then override them only where they change. I've done the same to override the schema in the @Table annotation, as it changes between my environments.
Using this approach you can also override the table name on individual entities.
[Updating this answer as it's not well documented and someone else may find it useful]
Here's my orm.xml file (note that I am only overriding the schema and leaving the other JPA and Hibernate annotations alone; however, changing the table here is entirely possible. Also note that I am annotating the field, not the getter):
<?xml version="1.0" encoding="UTF-8"?>
<entity-mappings
        xmlns="http://java.sun.com/xml/ns/persistence/orm"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm orm_2_0.xsd"
        version="1.0">
    <package>models.jpa.eglobal</package>
    <entity class="MyEntityOne" access="FIELD">
        <table name="ENTITY_ONE" schema="MY_SCHEMA"/>
    </entity>
    <entity class="MyEntityTwo" access="FIELD">
        <table name="ENTITY_TWO" schema="MY_SCHEMA"/>
    </entity>
</entity-mappings>
As an alternative to XML configuration, you may want to dynamically generate the Java class with the annotation using your preferred bytecode manipulation framework, as sketched below.
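For example, with Javassist (one possible bytecode library; the class and table names here are hypothetical, and this assumes the entity class does not already carry a @Table annotation and has not yet been loaded):

import javassist.ClassPool;
import javassist.CtClass;
import javassist.bytecode.AnnotationsAttribute;
import javassist.bytecode.ClassFile;
import javassist.bytecode.ConstPool;
import javassist.bytecode.annotation.Annotation;
import javassist.bytecode.annotation.StringMemberValue;

public class TableAnnotationAdder {

    public static Class<?> addTable(String entityClass, String tableName) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        CtClass ct = pool.get(entityClass);
        ClassFile classFile = ct.getClassFile();
        ConstPool constPool = classFile.getConstPool();

        // build a runtime-visible @javax.persistence.Table(name = tableName) annotation
        AnnotationsAttribute attr =
                new AnnotationsAttribute(constPool, AnnotationsAttribute.visibleTag);
        Annotation table = new Annotation("javax.persistence.Table", constPool);
        table.addMemberValue("name", new StringMemberValue(tableName, constPool));
        attr.addAnnotation(table);
        classFile.addAttribute(attr);

        // load the modified class; this must happen before anything else loads it
        return ct.toClass();
    }
}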
If you don't mind binding yourself to Hibernate, you could use some of the methods described at https://www.hibernate.org/171.html. You may find yourself using quite a few Hibernate annotations depending on the complexity of your data, as they go above and beyond the JPA spec, so it may be a small price to pay.
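One of the techniques along those lines is a custom NamingStrategy, which matches the tableName() idea already mentioned in the question. A rough sketch against the legacy org.hibernate.cfg API, with a hypothetical lookup map, could be:

import java.util.Map;
import org.hibernate.cfg.ImprovedNamingStrategy;

public class RuntimeTableNamingStrategy extends ImprovedNamingStrategy {

    // hypothetical map filled at startup, e.g. "com.example.Person" -> "PERSON_2013"
    private final Map<String, String> tableNames;

    public RuntimeTableNamingStrategy(Map<String, String> tableNames) {
        this.tableNames = tableNames;
    }

    @Override
    public String classToTableName(String className) {
        String override = tableNames.get(className);
        return override != null ? override : super.classToTableName(className);
    }
}

You would then register it via Configuration.setNamingStrategy(...); the hibernate.ejb.naming_strategy property also exists when bootstrapping through JPA, but it instantiates the strategy reflectively, so the map would then have to come from somewhere static.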