I'm developing a distributed software system and I'm trying to use the Hibernate framework for the first time. One of my classes (which is a mapped Entity) has an ArrayList<> containing objects of another class (which is also an Entity).
Ex:
I have an Event class, which is a @MappedSuperclass.
I have a CorporateEvent class, which is an Entity that inherits from Event.
I have a Task class, which is also an Entity.
My CorporateEvent class has an attribute called "tasks", which is an ArrayList.
How do I map this correctly to the database so that there's a table called Event and a table called Task, with a key connecting them to each other?
I'm using the newest Hibernate version and the database is on a PostgreSQL server. The dialect in Hibernate is set to PostgreSQL.
We've tried using @ElementCollection and @CollectionTable, but we're getting some weird NullPointerExceptions. It might have something to do with the inheritance aspect of the code.
Here are the three classes in question:
Event Class:
@MappedSuperclass
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class Event {

    @Column(name = "title")
    private String title;

    @Column(name = "location")
    private String location;

    @Column(name = "date")
    private String date;
}
CorporateEvent Class:
@Entity
@Table(name = "corporate_event")
public class CorporateEvent extends Event {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "id_generator")
    @SequenceGenerator(name = "id_generator", sequenceName = "corporate_event_id_seq", allocationSize = 1)
    @Column(name = "id", updatable = false, nullable = false)
    private int id;

    @ElementCollection
    private ArrayList<Task> tasks;

    @Column(name = "expenses")
    private String expenses;
}
Task Class:
@Entity
@Table(name = "task")
public class Task {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "id_generator")
    @SequenceGenerator(name = "id_generator", sequenceName = "task_task_id_seq", allocationSize = 1)
    @Column(name = "task_id", updatable = false, nullable = false)
    private int id;

    @Column(name = "name")
    private String taskName;

    @Column(name = "description")
    private String description;
}
I expect the output to be that a CorporateEvent object is saved to the database and the Task objects in its ArrayList are each saved in the Task table with a CorporateEvent ID that connects them to the right event.
What I instead get is a NullPointerException on the line where I try to save the CorporateEvent object using Hibernate. This is the error message:
------------------------------------------------------------------------
Building Eventer 1.0-SNAPSHOT
------------------------------------------------------------------------
--- maven-resources-plugin:2.5:resources (default-resources) @ Eventer ---
[debug] execute contextualize
Using 'UTF-8' encoding to copy filtered resources.
Copying 11 resources
--- maven-compiler-plugin:3.6.1:compile (default-compile) @ Eventer ---
Nothing to compile - all classes are up to date
--- exec-maven-plugin:1.2.1:exec (default-cli) @ Eventer ---
Nov 08, 2019 5:31:30 PM org.hibernate.Version logVersion
INFO: HHH000412: Hibernate Core {5.4.8.Final}
Nov 08, 2019 5:31:30 PM org.hibernate.annotations.common.reflection.java.JavaReflectionManager <clinit>
INFO: HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
Nov 08, 2019 5:31:31 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl configure
WARN: HHH10001002: Using Hibernate built-in connection pool (not for production use!)
Nov 08, 2019 5:31:31 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001005: using driver [org.postgresql.Driver] at URL [jdbc:postgresql://tek-mmmi-db0a.tek.c.sdu.dk:5432/si3_2019_group_5_db]
Nov 08, 2019 5:31:31 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001001: Connection properties: {user=si3_2019_group_5, password=****}
Nov 08, 2019 5:31:31 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001003: Autocommit mode: false
Nov 08, 2019 5:31:31 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl$PooledConnections <init>
INFO: HHH000115: Hibernate connection pool size: 1 (min=1)
Nov 08, 2019 5:31:31 PM org.hibernate.dialect.Dialect <init>
INFO: HHH000400: Using dialect: org.hibernate.dialect.PostgreSQLDialect
Nov 08, 2019 5:31:35 PM org.hibernate.cfg.AnnotationBinder bindClass
WARN: HHH000503: A class should not be annotated with both @Inheritance and @MappedSuperclass. @Inheritance will be ignored for: com.mycompany.domain.event.Event.
Nov 08, 2019 5:31:35 PM org.hibernate.boot.internal.InFlightMetadataCollectorImpl addIdentifierGenerator
WARN: HHH000069: Duplicate generator name id_generator
Exception in thread "main" java.lang.NullPointerException
at com.mycompany.repositories.EventRepository.saveCorporateEvent(EventRepository.java:51)
at com.mycompany.eventer.Eventer.main(Eventer.java:42)
------------------------------------------------------------------------
BUILD FAILURE
------------------------------------------------------------------------
Total time: 7.007s
Finished at: Fri Nov 08 17:31:35 CET 2019
Final Memory: 9M/100M
------------------------------------------------------------------------
Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (default-cli) on project Eventer: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
To see the full stack trace of the errors, re-run Maven with the -e switch.
Re-run Maven using the -X switch to enable full debug logging.
The line it complains about is "session.close();" in this method:
public int saveCorporateEvent(CorporateEvent corporateEvent) {
    try {
        session = ConnectRepository.factory.getCurrentSession();
        session.beginTransaction();
        session.save(corporateEvent);
        session.getTransaction().commit();
        return corporateEvent.getId();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        session.close();
    }
    return 0;
}
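A guarded close (a minimal sketch, not from the original code) avoids this: catch (Exception e) does not catch an Error such as ExceptionInInitializerError, so when the session factory fails to initialize, session is still null in the finally block and session.close() itself throws the NullPointerException, masking the real problem:

} finally {
    // session stays null if ConnectRepository's static initializer threw,
    // so an unguarded close() raises the NPE and hides the real cause.
    if (session != null) {
        session.close();
    }
}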
And if I remove the session.close();, I get the following annotation error, which I don't understand, since I thought an ArrayList would be supported:
(The beginning is the same, so I'm only pasting the difference in)
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.mycompany.repositories.EventRepository.saveCorporateEvent(EventRepository.java:42)
at com.mycompany.eventer.Eventer.main(Eventer.java:42)
Caused by: org.hibernate.AnnotationException: java.util.ArrayList collection type not supported for property: com.mycompany.domain.event.CorporateEvent.tasks
at org.hibernate.cfg.annotations.CollectionBinder.getCollectionBinder(CollectionBinder.java:317)
at org.hibernate.cfg.AnnotationBinder.processElementAnnotations(AnnotationBinder.java:1939)
at org.hibernate.cfg.AnnotationBinder.processIdPropertiesIfNotAlready(AnnotationBinder.java:975)
at org.hibernate.cfg.AnnotationBinder.bindClass(AnnotationBinder.java:802)
at org.hibernate.boot.model.source.internal.annotations.AnnotationMetadataSourceProcessorImpl.processEntityHierarchies(AnnotationMetadataSourceProcessorImpl.java:254)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess$1.processEntityHierarchies(MetadataBuildingProcess.java:230)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:273)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.build(MetadataBuildingProcess.java:83)
at org.hibernate.boot.internal.MetadataBuilderImpl.build(MetadataBuilderImpl.java:473)
at org.hibernate.boot.internal.MetadataBuilderImpl.build(MetadataBuilderImpl.java:84)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:689)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:724)
at com.mycompany.repositories.ConnectRepository.<clinit>(ConnectRepository.java:27)
... 2 more
------------------------------------------------------------------------
BUILD FAILURE
------------------------------------------------------------------------
Total time: 11.877s
Finished at: Fri Nov 08 17:44:27 CET 2019
Final Memory: 19M/208M
------------------------------------------------------------------------
Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (default-cli) on project Eventer: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
To see the full stack trace of the errors, re-run Maven with the -e switch.
Re-run Maven using the -X switch to enable full debug logging.
UPDATE:
Now it's somewhat working, but a new problem has appeared. As you can see below, Hibernate only inserts into the corporate_event table, not the task table.
------------------------------------------------------------------------
Building Eventer 1.0-SNAPSHOT
------------------------------------------------------------------------
--- maven-resources-plugin:2.5:resources (default-resources) @ Eventer ---
[debug] execute contextualize
Using 'UTF-8' encoding to copy filtered resources.
Copying 11 resources
--- maven-compiler-plugin:3.6.1:compile (default-compile) @ Eventer ---
Nothing to compile - all classes are up to date
--- exec-maven-plugin:1.2.1:exec (default-cli) @ Eventer ---
Nov 10, 2019 12:15:21 PM org.hibernate.Version logVersion
INFO: HHH000412: Hibernate Core {5.4.8.Final}
Nov 10, 2019 12:15:21 PM org.hibernate.annotations.common.reflection.java.JavaReflectionManager <clinit>
INFO: HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
Nov 10, 2019 12:15:22 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl configure
WARN: HHH10001002: Using Hibernate built-in connection pool (not for production use!)
Nov 10, 2019 12:15:22 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001005: using driver [org.postgresql.Driver] at URL [jdbc:postgresql://tek-mmmi-db0a.tek.c.sdu.dk:5432/si3_2019_group_5_db]
Nov 10, 2019 12:15:22 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001001: Connection properties: {user=si3_2019_group_5, password=****}
Nov 10, 2019 12:15:22 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl buildCreator
INFO: HHH10001003: Autocommit mode: false
Nov 10, 2019 12:15:22 PM org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl$PooledConnections <init>
INFO: HHH000115: Hibernate connection pool size: 1 (min=1)
Nov 10, 2019 12:15:22 PM org.hibernate.dialect.Dialect <init>
INFO: HHH000400: Using dialect: org.hibernate.dialect.PostgreSQLDialect
Nov 10, 2019 12:15:26 PM org.hibernate.cfg.AnnotationBinder bindClass
WARN: HHH000503: A class should not be annotated with both @Inheritance and @MappedSuperclass. @Inheritance will be ignored for: com.mycompany.domain.event.Event.
Nov 10, 2019 12:15:26 PM org.hibernate.boot.internal.InFlightMetadataCollectorImpl addIdentifierGenerator
WARN: HHH000069: Duplicate generator name id_generator
Nov 10, 2019 12:15:26 PM org.hibernate.boot.internal.InFlightMetadataCollectorImpl addIdentifierGenerator
WARN: HHH000069: Duplicate generator name id_generator
Hibernate: select nextval ('corporate_event_id_seq')
Hibernate: insert into corporate_event (date, description, location, max_participants, title, expenses, corporate_id) values (?, ?, ?, ?, ?, ?, ?)
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 9.780s
Finished at: Sun Nov 10 12:15:28 CET 2019
Final Memory: 9M/100M
------------------------------------------------------------------------
These are the changes I made using your help:
CorporateEvent class:
@OneToMany(cascade = CascadeType.ALL, mappedBy = "corporateEvent")
private List<Task> tasks;
Task class:
@ManyToOne
@JoinColumn(name = "corporate_id")
private CorporateEvent corporateEvent;
The code runs successfully, but it will not map the data to the Task table in my database. This is the test (main) code:
public static void main(String[] args) {
    ArrayList<Task> arrTasks = new ArrayList<>();
    arrTasks.add(new Task("task1", "taskDes1"));
    arrTasks.add(new Task("task2", "taskDes2"));
    arrTasks.add(new Task("task3", "taskDes3"));

    User user = new User("alex", "tholle", "sdu", "software engineering", "at@gmail.com", 70111213, "alexuser", "alexpass", new Date(), true, "student", "e2R213");
    Meetup meetup = new Meetup(user.getUserName(), "meetup at the pub", "old irish", new Date().toString(), "det bliver fed", 10);
    Task task = new Task("Cleaning", "Do some cleaning please");
    CorporateEvent corporateEvent = new CorporateEvent(null, arrTasks, "Tinderbox", "forest", new Date().toString(), "edm music festival", 5000);

    for (Task t : arrTasks) {
        t.setCorporateEvent(corporateEvent);
    }

    EventRepository eventRepo = new EventRepository();
    eventRepo.saveCorporateEvent(corporateEvent);
}
The database has the following tables regarding this issue:
create table task
(
task_id serial not null
constraint task_pk
primary key,
name varchar,
description varchar(1000),
corporate_id integer
constraint corporate_id
references corporate_event
on update cascade on delete cascade
);
alter table task
owner to si3_2019_group_5;
create unique index task_task_id_uindex
on task (task_id);
create table corporate_event
(
title varchar,
date varchar(1000),
location varchar,
description varchar,
max_participants integer,
expenses varchar(2000),
corporate_id serial not null
constraint corporate_event_pk
primary key,
tasks varchar(4000)
);
alter table corporate_event
owner to si3_2019_group_5;
create unique index corporate_event_id_uindex
on corporate_event (corporate_id);
Your error is likely due to your usage of the @ElementCollection annotation without a class annotated with @Embeddable; the AnnotationException also points out that the collection must be declared as the List interface rather than the concrete ArrayList class. I would use a @OneToMany mapping instead.
Your CorporateEvent class:
@OneToMany(mappedBy = "corporateEvent")
private List<Task> tasks;
Your Task class:
@ManyToOne
@JoinColumn(name = "corporate_event_id")
private CorporateEvent corporateEvent;
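Putting it together, a minimal sketch of the inverse side (the cascade setting, the initialized list, and the addTask helper are illustrative additions, not part of the original answer):

// javax.persistence imports assumed
@Entity
@Table(name = "corporate_event")
public class CorporateEvent extends Event {

    // ... id and other fields as above ...

    @OneToMany(mappedBy = "corporateEvent", cascade = CascadeType.ALL)
    private List<Task> tasks = new ArrayList<>();

    // Keeps both sides of the association in sync. Task.corporateEvent is the
    // owning side, so Hibernate only writes the foreign key it finds there.
    public void addTask(Task task) {
        tasks.add(task);
        task.setCorporateEvent(this);
    }
}

Note that mappedBy must name the Java field on the owning side (corporateEvent), not a table or column name.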
Related
I have a very large dataset and want to update certain entity kinds. I am exploring the MapReduce library in Google App Engine. I have followed the examples listed here:
https://github.com/GoogleCloudPlatform/appengine-mapreduce/tree/master/java/example/src/com/google/appengine/demos/mapreduce/entitycount
What I am basically doing is this, in my MapSpecification:
MapSpecification<Entity, Entity, Void> spec = new MapSpecification.Builder<>(
        new DatastoreKeyInput(query, 2),
        new UrlFlattenMapper(),
        new DatastoreOutput())
        .setJobName("Flatten URLs entities")
        .build();
and my Mapper basically performs the operations on the Entity and then emits it, for the DatastoreOutput writer to write back into the database.
My problem is this: the entities are getting updated fine, and endSlice is also being called in my mapper task, but the job is not completing. I keep getting these errors:
[INFO] INFO: RetryHelper(28.07 ms, 1 attempts, java.util.concurrent.Executors$RunnableAdapter#7f0264e0): Attempt #1 failed [java.lang.RuntimeException: Can't serialize object: MapOnlyShardTask[context=IncrementalTaskContext[jobId=3c041e68-5041-458c-994b-290cd941f8bb, shardNumber=1, shardCount=2, lastWorkItem=Topics("jzdh"), workerCallCount=297, workerTimeMillis=42513], inputExhausted=true, isFirstSlice=false]], sleeping for 1028 ms
[INFO] Apr 26, 2016 4:39:37 PM com.google.appengine.tools.cloudstorage.RetryHelper doRetry
[INFO] INFO: RetryHelper(1.085 s, 2 attempts, java.util.concurrent.Executors$RunnableAdapter#7f0264e0): Attempt #2 failed [java.lang.RuntimeException: Can't serialize object: MapOnlyShardTask[context=IncrementalTaskContext[jobId=3c041e68-5041-458c-994b-290cd941f8bb, shardNumber=1, shardCount=2, lastWorkItem=Topics("jzdh"), workerCallCount=297, workerTimeMillis=42513], inputExhausted=true, isFirstSlice=false]], sleeping for 2435 ms
[INFO] Apr 26, 2016 4:39:37 PM com.google.appengine.tools.cloudstorage.RetryHelper doRetry
[INFO] INFO: RetryHelper(3.562 s, 3 attempts, java.util.concurrent.Executors$RunnableAdapter#6d7fcd47): Attempt #3 failed [java.lang.RuntimeException: Can't serialize object: MapOnlyShardTask[context=IncrementalTaskContext[jobId=3c041e68-5041-458c-994b-290cd941f8bb, shardNumber=0, shardCount=2, lastWorkItem=Topics("jz63"), workerCallCount=289, workerTimeMillis=41536], inputExhausted=true, isFirstSlice=false]], sleeping for 3421 ms
[INFO] Apr 26, 2016 4:39:39 PM com.google.appengine.tools.cloudstorage.RetryHelper doRetry
[INFO] INFO: RetryHelper(3.567 s, 3 attempts, java.util.concurrent.Executors$RunnableAdapter#7f0264e0): Attempt #3 failed [java.lang.RuntimeException: Can't serialize object: MapOnlyShardTask[context=IncrementalTaskContext[jobId=3c041e68-5041-458c-994b-290cd941f8bb, shardNumber=1, shardCount=2, lastWorkItem=Topics("jzdh"), workerCallCount=297, workerTimeMillis=42513], inputExhausted=true, isFirstSlice=false]], sleeping for 3340 ms
[INFO] Apr 26, 2016 4:39:41 PM com.google.appengine.tools.cloudstorage.RetryHelper doRetry
[INFO] INFO: RetryHelper(7.015 s, 4 attempts, java.util.concurrent.Executors$RunnableAdapter#6d7fcd47): Attempt #4 failed [java.lang.RuntimeException: Can't serialize object: MapOnlyShardTask[context=IncrementalTaskContext[jobId=3c041e68-5041-458c-994b-290cd941f8bb, shardNumber=0, shardCount=2, lastWorkItem=Topics("jz63"), workerCallCount=289, workerTimeMillis=41536], inputExhausted=true, isFirstSlice=false]], sleeping for 6941 ms
[INFO] Apr 26, 2016 4:39:42 PM com.google.appengine.tools.cloudstorage.RetryHelper doRetry
I haven't been able to get around this issue; any help or pointers on what I could be doing wrong would be greatly appreciated.
The culprit in my case was a Datastore field I had used in the map job, which was not serializable. I put transient in front of the field, and the issue was solved.
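For context, a sketch of the pattern (the class name matches the question's mapper, but the field and the beginSlice re-initialization are my assumptions about the appengine-mapreduce API):

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.tools.mapreduce.MapOnlyMapper;

public class UrlFlattenMapper extends MapOnlyMapper<Entity, Entity> {

    // DatastoreService is not Serializable; without 'transient' the shard task
    // fails with "Can't serialize object" when the framework checkpoints it.
    private transient DatastoreService datastore;

    @Override
    public void beginSlice() {
        // Re-acquire the handle each slice instead of serializing it.
        datastore = DatastoreServiceFactory.getDatastoreService();
    }

    @Override
    public void map(Entity value) {
        // ... transform the entity ...
        emit(value);
    }
}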
I am migrating an application from Hibernate 4.3 to Hibernate 5.0.1-Final.
I use ImplicitNamingStrategyComponentPathImpl as my hibernate.implicit_naming_strategy with Postgres 9.4.4, and my company uses hibernate.hbm2ddl.auto = update for deployment (I know it is a bad practice but can't help it).
While the session factory initializes, it throws the error below. Apparently the generated alias is too long for Postgres. How do we go about this situation? I have tried assigning a @Table(name=..) annotation to work around it, but it is getting worse, as every relationship from that point gets screwed up.
Caused by: org.hibernate.tool.schema.spi.SchemaManagementException: Unable to execute schema management to JDBC target [create table public.ReferenceDocumentVersion_ReferenceDocumentSourceFilesStoreDescriptor (ReferenceDocumentVersion_unid uuid not null, sourceFilesStore_filesDescriptorMap_unid uuid not null, filesDescriptorMap_KEY text not null, primary key (ReferenceDocumentVersion_unid, filesDescriptorMap_KEY))]
at org.hibernate.tool.schema.internal.TargetDatabaseImpl.accept(TargetDatabaseImpl.java:59)
at org.hibernate.tool.schema.internal.SchemaMigratorImpl.applySqlString(SchemaMigratorImpl.java:371)
at org.hibernate.tool.schema.internal.SchemaMigratorImpl.applySqlStrings(SchemaMigratorImpl.java:360)
at org.hibernate.tool.schema.internal.SchemaMigratorImpl.createTable(SchemaMigratorImpl.java:181)
at org.hibernate.tool.schema.internal.SchemaMigratorImpl.doMigrationToTargets(SchemaMigratorImpl.java:134)
at org.hibernate.tool.schema.internal.SchemaMigratorImpl.doMigration(SchemaMigratorImpl.java:59)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:129)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:97)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:481)
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:444)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:802)
... 29 more
Caused by: org.postgresql.util.PSQLException: ERROR: relation "referencedocumentversion_referencedocumentsourcefilesstoredescr" already exists
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2182)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1911)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:173)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:618)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:454)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:382)
at org.apache.tomcat.dbcp.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
at org.apache.tomcat.dbcp.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at net.bull.javamelody.JdbcWrapper.doExecute(JdbcWrapper.java:404)
at net.bull.javamelody.JdbcWrapper$StatementInvocationHandler.invoke(JdbcWrapper.java:129)
at net.bull.javamelody.JdbcWrapper$DelegatingInvocationHandler.invoke(JdbcWrapper.java:286)
at com.sun.proxy.$Proxy93.executeUpdate(Unknown Source)
at org.hibernate.tool.schema.internal.TargetDatabaseImpl.accept(TargetDatabaseImpl.java:56)
... 39 more
I have addressed the situation with a custom ImplicitNamingStrategy that truncates Hibernate-generated identifiers to 63 characters (the maximum identifier length for Postgres).
Previous versions of Hibernate (4.x) encountered the same error, but they just ignored it and proceeded with initializing the SessionFactory. However, Hibernate 5.x has a new bootstrap API which throws a SchemaManagementException in such cases and aborts. Hibernate logs from my test scenarios are pasted below for reference.
Hibernate 4.X
INFO: HHH000396: Updating schema
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.DatabaseMetadata getTableMetadata
INFO: HHH000262: Table not found: ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.DatabaseMetadata getTableMetadata
INFO: HHH000262: Table not found: ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.DatabaseMetadata getTableMetadata
INFO: HHH000262: Table not found: ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.SchemaUpdate execute
ERROR: HHH000388: Unsuccessful: create table ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres (unid uuid not null, path text, primary key (unid))
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.SchemaUpdate execute
ERROR: ERROR: relation "referencedocumentversionentitywithareallyreallyreallylongnamebe" already exists
Oct 04, 2015 1:38:00 PM org.hibernate.tool.hbm2ddl.SchemaUpdate execute
INFO: HHH000232: Schema update complete
Hibernate 5.0.2.Final
Oct 04, 2015 1:39:16 PM org.hibernate.tool.hbm2ddl.SchemaUpdate execute
INFO: HHH000228: Running hbm2ddl schema update
Oct 04, 2015 1:39:16 PM org.hibernate.tool.schema.extract.internal.InformationExtractorJdbcDatabaseMetaDataImpl processGetTableResults
INFO: HHH000262: Table not found: ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres
Oct 04, 2015 1:39:16 PM org.hibernate.tool.schema.extract.internal.InformationExtractorJdbcDatabaseMetaDataImpl processGetTableResults
INFO: HHH000262: Table not found: ReferenceDocumentVersionEntityWithAReallyReallyReallyLongNameBeyondPostGres
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.813 sec <<< FAILURE!
testApp(org.foobar.AppTest) Time elapsed: 0.788 sec <<< ERROR!
javax.persistence.PersistenceException: [PersistenceUnit: org.foobar.persistence.default] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException(EntityManagerFactoryBuilderImpl.java:877)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:805)
at org.hibernate.jpa.HibernatePersistenceProvider.createEntityManagerFactory(HibernatePersistenceProvider.java:58)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:55)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:39)
at org.foobar.AppTest.testApp(AppTest.java:18)
Solution
Custom ImplicitNamingStrategy
package org.foobar.persistence;
import org.hibernate.boot.model.naming.Identifier;
import org.hibernate.boot.model.naming.ImplicitNamingStrategyComponentPathImpl;
import org.hibernate.boot.spi.MetadataBuildingContext;
public class PGConstrainedImplicitNamingStrategy extends ImplicitNamingStrategyComponentPathImpl {

    private static final int POSTGRES_IDENTIFIER_MAXLENGTH = 63;

    public static final PGConstrainedImplicitNamingStrategy INSTANCE = new PGConstrainedImplicitNamingStrategy();

    public PGConstrainedImplicitNamingStrategy() {
    }

    @Override
    protected Identifier toIdentifier(String stringForm, MetadataBuildingContext buildingContext) {
        // Truncate every implicitly generated identifier to the Postgres limit.
        return buildingContext.getMetadataCollector()
                .getDatabase()
                .getJdbcEnvironment()
                .getIdentifierHelper()
                .toIdentifier( stringForm.substring( 0, Math.min( POSTGRES_IDENTIFIER_MAXLENGTH, stringForm.length() ) ) );
    }
}
persistence.xml
<properties>
<property name="hibernate.implicit_naming_strategy" value="org.foobar.persistence.PGConstrainedImplicitNamingStrategy"/>
</properties>
This is not a scalable solution at all, but it helps to keep the show running. The permanent solution would be to explicitly supply identifiers so that Hibernate does not generate really long identifiers; see the answer written by maaartinus.
Try to follow the migration guide in the Hibernate documentation at this link:
https://github.com/hibernate/hibernate-orm/blob/5.0/migration-guide.adoc
The OP's solution may lead to collisions (that's why he calls it not scalable, right?). Explicitly supplying all identifiers sounds like a terrible idea to me. I'd suggest one of the following:
provide a Map<String, String> mapping all overlong names to something shorter
shorten all overlong names to POSTGRES_IDENTIFIER_MAXLENGTH - N and append N characters generated from the hash of the cut-away part, so that the probability of collisions gets minimized (see the sketch after this list)
Use some identifier abbreviating function like {"Reference" -> "Ref", "Document" -> "Doc", ...} and apply it to your identifiers before they get processed, so that you get RefDocVersion_RefDocSourceFileDescr... instead of referencedocumentversion_referencedocumentsourcefilesstoredescr....
Consider using abbreviated names in your code itself. This is often advised against, as it easily leads to incomprehensible nonsense, but IMHO it increases readability when used right (use only a couple of abbreviations and use them systematically; provide a list of them).
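A sketch of the second option (the class, the choice of SHA-256, and N = 8 hex characters are mine, not from the answer):

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public final class IdentifierShortener {

    private static final int POSTGRES_IDENTIFIER_MAXLENGTH = 63;
    private static final int HASH_CHARS = 8; // N characters reserved for the hash suffix

    // Shortens an overlong identifier deterministically; different overlong
    // names that truncate to the same prefix still get distinct suffixes.
    public static String shorten(String name) {
        if (name.length() <= POSTGRES_IDENTIFIER_MAXLENGTH) {
            return name;
        }
        int keep = POSTGRES_IDENTIFIER_MAXLENGTH - HASH_CHARS;
        try {
            // Hash only the part that gets cut away.
            byte[] digest = MessageDigest.getInstance("SHA-256")
                    .digest(name.substring(keep).getBytes(StandardCharsets.UTF_8));
            StringBuilder suffix = new StringBuilder();
            for (int i = 0; i < HASH_CHARS / 2; i++) {
                suffix.append(String.format("%02x", digest[i]));
            }
            return name.substring(0, keep) + suffix;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }
}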
I am developing a Java application that executes SSH commands using Ganymed SSH-2.
I need to produce full logs for each sequence of commands, e.g. zip file transfer, unzipping, zipping, etc.
Having searched the source code of ch.ethz.ssh2.log.Logger, I can set the field public static volatile boolean enabled = false; to true.
This provides the following output:
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: kex_algo=diffie-hellman-group-exchange-sha1
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: server_host_key_algo=ssh-rsa
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: enc_algo_client_to_server=aes128-ctr
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: enc_algo_server_to_client=aes128-ctr
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: mac_algo_client_to_server=hmac-sha1-96
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: mac_algo_server_to_client=hmac-sha1-96
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: comp_algo_client_to_server=none
Mar 05, 2015 10:17:25 AM ch.ethz.ssh2.log.Logger info
INFO: comp_algo_server_to_client=none
However I also require ALL level logging for command execution including file transfers.
How do i configure the Logger to produce all the information available?
A little late answer but maybe someone still needs this info.
I managed to get the debug statements visible like this:
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public void enableFineLogging() {
    try {
        // Turn on Ganymed's internal logging first.
        ch.ethz.ssh2.log.Logger.enabled = true;

        String name = "myDynamicFileNamePart";
        FileHandler fileHandler = new FileHandler("./logs/"
                + name + "_SFTP.log", 10000000, 1000, true);
        fileHandler.setLevel(Level.FINE);
        fileHandler.setFormatter(new SimpleFormatter());

        // Route everything under the "ch.ethz" logger namespace to the file.
        final Logger app = Logger.getLogger("ch.ethz");
        app.setLevel(Level.FINE);
        app.addHandler(fileHandler);
        app.setUseParentHandlers(false);
    } catch (Exception e) {
        // Log or handle the exception as appropriate.
    }
}
This results in a file containing:
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter info
INFO: Client identity string: SSH-2.0-SSHJ_0.19.1
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter info
INFO: Server identity string: SSH-2.0-OpenSSH_6.6.1
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter debug
FINE: Setting <> to null
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter debug
FINE: Sending SSH_MSG_KEXINIT
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter debug
FINE: Setting <> to SOME
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter debug
FINE: Awaiting <>
marras 15, 2017 12:16:56 IP. org.slf4j.impl.JCLLoggerAdapter debug
FINE: Received SSH_MSG_KEXINIT
Use a ConsoleHandler if you want the logs in the console; a minimal sketch follows below.
Also, closing the log file needs to be considered: call fileHandler.close() after you quit logging.
Tune the log level by choosing from SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST, or ALL.
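A minimal console variant of the same setup (a sketch; Level.ALL is my choice to capture everything):

import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public void enableConsoleLogging() {
    ch.ethz.ssh2.log.Logger.enabled = true;

    ConsoleHandler consoleHandler = new ConsoleHandler();
    consoleHandler.setLevel(Level.ALL); // pass everything the logger emits

    Logger app = Logger.getLogger("ch.ethz");
    app.setLevel(Level.ALL);
    app.addHandler(consoleHandler);
    app.setUseParentHandlers(false); // avoid duplicate lines via the root logger
}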
After I update some entities in GWT, I would like to save them. However, when I try to persist them, the change does not show up in the App Engine admin interface. The boolean has not changed.
Code
EntityManager em = EMF.get().createEntityManager();
for (OnixUser s : admin) {
    log.info(s.email + ", " + s.isAdmin);
    em.merge(s);
}
em.close();
Update with transaction
EntityManager em = EMF.get().createEntityManager();
em.getTransaction().begin();
for (OnixUser s : admin) {
    log.info(s.email + ", " + s.isAdmin);
    OnixUser merged = em.merge(s);
    em.persist(merged);
    // em.persist(s);
}
em.getTransaction().commit();
em.close();
Still did not save. No exceptions thrown.
Log
Oct 16, 2013 3:19:10 PM com.example.sdm.server.SDMServiceImpl setAdmin
INFO: chloe@example.com, true
App Engine admin interface for OnixUser entity
Log at FINEST level
FINE: Created ManagedConnection using DatastoreService = com.google.appengine.api.datastore.DatastoreServiceImpl#2fd9270d
Oct 16, 2013 4:03:14 PM org.datanucleus.store.connection.ConnectionManagerImpl allocateConnection
FINE: Connection added to the pool : com.google.appengine.datanucleus.DatastoreConnectionFactoryImpl$DatastoreManagedConnection#31c1f89d for key=org.datanucleus.ObjectManagerImpl#6977c57b in factory=ConnectionFactory:tx[com.google.appengine.datanucleus.DatastoreConnectionFactoryImpl#2b1f5f6b]
Oct 16, 2013 4:03:14 PM com.example.sdm.server.SDMServiceImpl setAdmin
INFO: chloe@example.com, true
Oct 16, 2013 4:03:14 PM org.datanucleus.state.LifeCycleState changeState
FINE: Object "com.example.sdm.shared.OnixUser#48ef6e99" (id="com.example.sdm.shared.OnixUser:6456332278300672") has a lifecycle change : "P_CLEAN"->"P_NONTRANS"
Oct 16, 2013 4:03:14 PM org.datanucleus.store.connection.ConnectionManagerImpl$1 managedConnectionPostClose
FINE: Connection removed from the pool : com.google.appengine.datanucleus.DatastoreConnectionFactoryImpl$DatastoreManagedConnection#31c1f89d for key=org.datanucleus.ObjectManagerImpl#6977c57b in factory=ConnectionFactory:tx[com.google.appengine.datanucleus.DatastoreConnectionFactoryImpl#2b1f5f6b]
Oct 16, 2013 4:03:14 PM org.datanucleus.state.LifeCycleState changeState
FINE: Object "com.example.sdm.shared.OnixUser#48ef6e99" (id="com.example.sdm.shared.OnixUser:6456332278300672") has a lifecycle change : "P_NONTRANS"->"DETACHED_CLEAN"
Oct 16, 2013 4:03:14 PM com.google.apphosting.utils.jetty.AppEngineAuthentication$AppEngineUserRealm disassociate
FINE: Ignoring disassociate call for: chloe@example.com
If, in spite of using transactions, your data is not persisted, the best way would be to enable logging to see what's happening underneath. As you are using DataNucleus as your persistence provider, you can refer to this link to configure SQL logging. The information relevant to you is given near the end of the page.
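For example, with Log4j on the classpath, a log4j.properties along these lines surfaces the datastore-level operations (a sketch; the DataNucleus.Datastore categories are DataNucleus's standard logging categories, while the appender setup is an assumption):

# Route DataNucleus persistence and datastore categories to a file.
log4j.category.DataNucleus.Datastore=DEBUG, DN
log4j.category.DataNucleus.Datastore.Native=DEBUG, DN

log4j.appender.DN=org.apache.log4j.FileAppender
log4j.appender.DN.File=datanucleus.log
log4j.appender.DN.layout=org.apache.log4j.PatternLayout
log4j.appender.DN.layout.ConversionPattern=%d{HH:mm:ss,SSS} %-5p [%c] - %m%n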
I am working on Mahout and ran into an issue when I changed my CSV; previously it was giving me proper recommendations.
Example code:
DataModel model = new FileDataModel(new File("E:\\WriteTest.csv"));
UserSimilarity similarity = new PearsonCorrelationSimilarity(model);
UserNeighborhood neighborhood = new NearestNUserNeighborhood(2, similarity, model);
Recommender recommender = new GenericUserBasedRecommender(model, neighborhood, similarity);
List<RecommendedItem> recommendations = recommender.recommend(1, 1);
for (RecommendedItem recommendation : recommendations) {
    System.out.println(recommendation);
}
I have just updated the values in my CSV and it has stopped giving me suggestions.
CSV that is not giving me any result:
1,13,9.9
1,26,9.0
1,40,4.0
2,83,9.9
2,167,9.0
2,250,4.0
3,91,9.9
3,167,9.0
3,274,4.0
4,91,9.9
4,167,2.0
CSV which is giving me result:
1,101,5.0
1,102,3.0
1,103,3.0
2,101,5.0
2,102,2.5
2,103,3.0
2,104,2.1
3,101,5.0
3,102,2.5
3,105,4.0
3,107,5.0
4,102,2.0
4,104,4.0
4,105,2.5
4,106,3.0
4,107,2.6
5,101,5.0
5,102,3.4
5,104,2.5
5,105,2.5
5,106,1.0
Output on console respectively:
Result from 1st dataset:
Aug 27, 2011 2:45:06 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Creating FileDataModel for file WriteTest.csv
Aug 27, 2011 2:45:06 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Reading file info...
Aug 27, 2011 2:45:06 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Read lines: 11
Aug 27, 2011 2:45:06 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Processed 4 users
I was expecting item no. 167 but didn't get any recommendation.
Output of 2nd dataset:
Aug 27, 2011 2:52:42 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Creating FileDataModel for file WriteTest.csv
Aug 27, 2011 2:52:42 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Reading file info...
Aug 27, 2011 2:52:42 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Read lines: 21
Aug 27, 2011 2:52:42 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Processed 5 users
RecommendedItem[item:105, value:3.25]
The recommender is working correctly. The problem is that your data is too sparse. It cannot find a similarity that would link two users such that 167 is recommendable. Try a more realistic data set and I think the behavior will look less surprising.