I would like to know why Spring JPA isn't deleting the rows that are causing a constraint error, or a way to make it possible to edit my Account entity with new roles.
The following entities are involved in the action I'm trying to perform.
@Entity
@Table(name = "account")
class AccountEntity(uuid: UUID? = null,
                    @Column(nullable = false) val email: String,
                    @Column(nullable = false) val password: String,
                    @OneToMany(
                        mappedBy = "accountUuid",
                        cascade = [CascadeType.ALL],
                        fetch = FetchType.LAZY
                    ) val accountRolesEntity: List<AccountRolesEntity>) : BaseEntity(uuid)

@Entity
@Table(name = "account_roles")
class AccountRolesEntity(uuid: UUID? = null,
                         @Column(nullable = false) val accountUuid: UUID,
                         @OneToOne val role: RoleEntity) : BaseEntity(uuid)

@Entity
@Table(name = "role")
class RoleEntity(uuid: UUID? = null,
                 @Column(nullable = false) val name: String) : BaseEntity(uuid)
So I'm trying to update the roles of a specific account.
For example: if account X has roles 'viewer' and 'editor' and I want to change it to 'viewer' only, I do the following steps:
Request the account entity from the database
Set the new accountRolesEntity (received from the controller) on the account
Call the JPA repository save method
Method in Service class:
fun updateExistingAccount(account: AccountDTO, adjustedRoles: List<RoleDTO>): AccountDTO {
    val mappedRoles: List<AccountRolesEntity> = adjustedRoles.map { accountRolesMapper.map(account.uuid, it) }
    val accountEntity = accountMapper.map(account, mappedRoles)
    return accountMapper.map(accountRepository.save(accountEntity))
}
The error I'm getting is: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "account_roles_account_uuid_role_uuid_key"
This is because I have a constraint in my database to make sure that an account may not have duplicate roles. The create table statement is as follows:
CREATE TABLE account_roles (
uuid UUID PRIMARY KEY,
account_uuid UUID NOT NULL REFERENCES account(uuid),
role_uuid UUID NOT NULL REFERENCES role(uuid),
UNIQUE (account_uuid, role_uuid)
);
There is a fix for this by performing all the actions one by one: delete first and then make the new inserts. But there should be a better way to do this.
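For reference, the "delete first, then insert" workaround could look roughly like the sketch below (written in Java; the repository and mapper methods used here, such as deleteByAccountUuid and account.getUuid(), are assumptions for illustration, not part of the original code):

// Hedged sketch of the one-by-one workaround: remove the existing join rows,
// then insert the adjusted set.
@Transactional
public AccountDTO replaceRoles(AccountDTO account, List<RoleDTO> adjustedRoles) {
    accountRolesRepository.deleteByAccountUuid(account.getUuid()); // hypothetical derived delete query

    List<AccountRolesEntity> mapped = adjustedRoles.stream()
            .map(role -> accountRolesMapper.map(account.getUuid(), role))
            .collect(Collectors.toList());
    accountRolesRepository.saveAll(mapped);

    return accountMapper.map(accountRepository.findById(account.getUuid()).orElseThrow());
}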
actually "AccountRolesEntity" is not an entity, it's a table which keeps the relationships of two entity by keeping ids of that entities into the table.
so for the first step, you should have something like this,
@OneToMany(cascade = [CascadeType.ALL], fetch = FetchType.LAZY)
@JoinTable(name = "account_role",
    joinColumns = [JoinColumn(name = "account")],
    inverseJoinColumns = [JoinColumn(name = "role")])
val roles: List<RoleEntity>) : BaseEntity(uuid)
and I think your problem also comes down to your cascades and your class design, so check those again.
I cannot get the OneToMany mappings below to work properly, even though they are supposedly validated (by hibernate.ddl-auto=validate). I can insert all entities in the application with no problems, but when doing a findAll or findById, the queries Hibernate generates for me are wrong and result in exceptions. This is very likely due to a problem with my OneToMany mappings, or a missing ManyToOne mapping, but I don't see how to make it work.
Currently, the following tables exist in my postgres12 database:
CREATE TABLE battlegroups (
id uuid,
gameworld_id uuid,
name varchar(255),
PRIMARY KEY(id)
);
CREATE TABLE battlegroup_players (
id uuid,
battlegroup_id uuid,
player_id integer,
name varchar(255),
tribe varchar(255),
PRIMARY KEY (id)
);
CREATE TABLE battlegroup_player_villages(
battlegroup_id uuid,
player_id integer,
village_id integer,
x integer,
y integer,
village_name varchar(255),
tribe varchar(255),
PRIMARY KEY(battlegroup_id, player_id, village_id, x, y)
);
These are mapped to the following entities in Kotlin:
@Entity
@Table(name = "battlegroups")
class BattlegroupEntity(
    @Id
    val id: UUID,
    @Column(name = "gameworld_id")
    val gameworldId: UUID,
    val name: String? = "",
    @OneToMany(mappedBy = "battlegroupId", cascade = [CascadeType.ALL], fetch = FetchType.EAGER)
    private val players: MutableList<BattlegroupPlayerEntity>)

@Entity
@Table(name = "battlegroup_players")
class BattlegroupPlayerEntity(
    @Id
    val id: UUID,
    @Column(name = "battlegroup_id")
    val battlegroupId: UUID,
    @Column(name = "player_id")
    val playerId: Int,
    val name: String,
    @Enumerated(EnumType.STRING)
    val tribe: Tribe,
    @OneToMany(mappedBy = "id.playerId", cascade = [CascadeType.ALL], fetch = FetchType.EAGER)
    val battlegroupPlayerVillages: MutableList<BattlegroupPlayerVillageEntity>)

@Entity
@Table(name = "battlegroup_player_villages")
class BattlegroupPlayerVillageEntity(
    @EmbeddedId
    val id: BattlegroupPlayerVillageId,
    @Column(name = "village_name")
    val villageName: String,
    @Enumerated(EnumType.STRING)
    val tribe: Tribe)

@Embeddable
data class BattlegroupPlayerVillageId(
    @Column(name = "battlegroup_id")
    val battlegroupId: UUID,
    @Column(name = "player_id")
    val playerId: Int,
    @Column(name = "village_id")
    val villageId: Int,
    val x: Int,
    val y: Int
) : Serializable
This is the SQL hibernate generates when I do a findAll/findById on a battlegroup:
select
battlegrou0_.id as id1_2_0_,
battlegrou0_.gameworld_id as gameworl2_2_0_,
battlegrou0_.name as name3_2_0_,
players1_.battlegroup_id as battlegr2_1_1_,
players1_.id as id1_1_1_,
players1_.id as id1_1_2_,
players1_.battlegroup_id as battlegr2_1_2_,
players1_.name as name3_1_2_,
players1_.player_id as player_i4_1_2_,
players1_.tribe as tribe5_1_2_,
battlegrou2_.player_id as player_i2_0_3_,
battlegrou2_.battlegroup_id as battlegr1_0_3_,
battlegrou2_.village_id as village_3_0_3_,
battlegrou2_.x as x4_0_3_,
battlegrou2_.y as y5_0_3_,
battlegrou2_.battlegroup_id as battlegr1_0_4_,
battlegrou2_.player_id as player_i2_0_4_,
battlegrou2_.village_id as village_3_0_4_,
battlegrou2_.x as x4_0_4_,
battlegrou2_.y as y5_0_4_,
battlegrou2_.tribe as tribe6_0_4_,
battlegrou2_.village_name as village_7_0_4_
from
battlegroups battlegrou0_
left outer join
battlegroup_players players1_
on battlegrou0_.id=players1_.battlegroup_id
left outer join
battlegroup_player_villages battlegrou2_
on players1_.id=battlegrou2_.player_id -- ERROR: comparing integer to uuid
where
battlegrou0_.id=?
This results in an exception:
PSQLException: ERROR: operator does not exist: integer = uuid
Which makes perfect sense, since it is comparing the battlegroup_players id, which is a uuid, to the battlegroup_player_villages player_id, which is an integer. It should instead be comparing/joining on the battlegroup_player's player_id to the battlegroup_player_village's player_id.
If I change the sql to reflect that and manually execute the above query with the error line replaced:
on players1_.player_id=battlegrou2_.player_id
I get exactly the results I want. How can I change the OneToMany mappings so that it does exactly that?
Is it possible to do this without having a BattlegroupPlayerEntity object in my BattlegroupPlayerVillageEntity class?
Bonus points if you can get the left outer joins to become regular inner joins.
EDIT:
I tried the current answer, but had to slightly adjust my embedded id because my code would not compile otherwise; it should amount to the same thing:
@Embeddable
data class BattlegroupPlayerVillageId(
    @Column(name = "battlegroup_id")
    val battlegroupId: UUID,
    @Column(name = "village_id")
    val villageId: Int,
    val x: Int,
    val y: Int
) : Serializable {
    @ManyToOne
    @JoinColumn(name = "player_id")
    var player: BattlegroupPlayerEntity? = null
}
Using this still results in a comparison between int and uuid, for some reason.
Schema-validation: wrong column type encountered in column [player_id] in table [battlegroup_player_villages]; found [int4 (Types#INTEGER)], but expecting [uuid (Types#OTHER)]
Interestingly, if I try to put a referencedColumnName = "player_id" in there, I get a StackOverflowError instead.
I did some digging and found some issues with the mapping as well as with the classes; I will try to explain as much as possible.
WARNING: long answer ahead.
I will use Java for the code; I hope converting it to Kotlin is not a problem.
There are some issues with the classes as well (hint: Serializable), so the classes must implement Serializable.
I used Lombok to reduce the boilerplate.
Here is the changed BattleGroup entity:
@Entity
@Getter
@NoArgsConstructor
@Table(name = "battle_group")
public class BattleGroup implements Serializable {

    private static final long serialVersionUID = 6396336405158170608L;

    @Id
    private UUID id;

    private String name;

    @OneToMany(mappedBy = "battleGroupId", cascade = CascadeType.ALL, orphanRemoval = true, fetch = FetchType.LAZY)
    private List<BattleGroupPlayer> players = new ArrayList<>();

    public BattleGroup(UUID id, String name) {
        this.id = id;
        this.name = name;
    }

    public void addPlayer(BattleGroupPlayer player) {
        players.add(player);
    }
}
and the BattleGroupVillage entity and BattleGroupVillageId embeddable:
@AllArgsConstructor
@Entity
@Getter
@NoArgsConstructor
@Table(name = "battle_group_village")
public class BattleGroupVillage implements Serializable {

    private static final long serialVersionUID = -4928557296423893476L;

    @EmbeddedId
    private BattleGroupVillageId id;

    private String name;
}

@Embeddable
@EqualsAndHashCode
@Getter
@NoArgsConstructor
public class BattleGroupVillageId implements Serializable {

    private static final long serialVersionUID = -6375405007868923427L;

    @Column(name = "battle_group_id")
    private UUID battleGroupId;

    @Column(name = "player_id")
    private Integer playerId;

    @Column(name = "village_id")
    private Integer villageId;

    public BattleGroupVillageId(UUID battleGroupId, Integer playerId, Integer villageId) {
        this.battleGroupId = battleGroupId;
        this.villageId = villageId;
        this.playerId = playerId;
    }
}
Now, Serializable needs to be implemented in every class: we use @EmbeddedId, which requires the id class to be Serializable, and hence every parent class must implement Serializable as well, otherwise it gives an error.
Now we can solve the problem using the @JoinColumn annotation like below:
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
@JoinColumn(name = "player_id", referencedColumnName = "player_id")
private List<BattleGroupVillage> villages = new ArrayList<>();
Here name refers to the column in the child table and referencedColumnName to the column in the parent table.
This joins on the player_id column in both entities, and Hibernate now generates the following query:
SELECT
battlegrou0_.id AS id1_0_0_,
battlegrou0_.name AS name2_0_0_,
players1_.battle_group_id AS battle_g2_1_1_,
players1_.id AS id1_1_1_,
players1_.id AS id1_1_2_,
players1_.battle_group_id AS battle_g2_1_2_,
players1_.player_id AS player_i3_1_2_,
villages2_.player_id AS player_i4_2_3_,
villages2_.battle_group_id AS battle_g1_2_3_,
villages2_.village_id AS village_2_2_3_,
villages2_.battle_group_id AS battle_g1_2_4_,
villages2_.player_id AS player_i4_2_4_,
villages2_.village_id AS village_2_2_4_,
villages2_.name AS name3_2_4_
FROM
battle_group battlegrou0_
LEFT OUTER JOIN
battle_group_player players1_ ON battlegrou0_.id = players1_.battle_group_id
LEFT OUTER JOIN
battle_group_village villages2_ ON players1_.player_id = villages2_.player_id
WHERE
battlegrou0_.id = 1;
But this would give 2 players if you check the BattleGroup#getPlayers() method; below is a test case to verify.
UUID battleGroupId = UUID.randomUUID();

doInTransaction(em -> {
    BattleGroupPlayer player = new BattleGroupPlayer(UUID.randomUUID(), battleGroupId, 1);
    BattleGroupVillageId villageId1 = new BattleGroupVillageId(
        battleGroupId,
        1,
        1
    );
    BattleGroupVillageId villageId2 = new BattleGroupVillageId(
        battleGroupId,
        1,
        2
    );
    BattleGroupVillage village1 = new BattleGroupVillage(villageId1, "Village 1");
    BattleGroupVillage village2 = new BattleGroupVillage(villageId2, "Village 2");
    player.addVillage(village1);
    player.addVillage(village2);

    BattleGroup battleGroup = new BattleGroup(battleGroupId, "Takeshi Castle");
    battleGroup.addPlayer(player);
    em.persist(battleGroup);
});

doInTransaction(em -> {
    BattleGroup battleGroup = em.find(BattleGroup.class, battleGroupId);
    assertNotNull(battleGroup);
    assertEquals(2, battleGroup.getPlayers().size());
    BattleGroupPlayer player = battleGroup.getPlayers().get(0);
    assertEquals(2, player.getVillages().size());
});
If your use case is to get a single player from the BattleGroup, then you would have to use FetchType.LAZY, which by the way is also good for performance.
Why does LAZY work?
Because lazy loading issues a separate select statement only when you actually access the association. EAGER loads the whole graph wherever you have it: it tries to load every relationship mapped with this fetch type, hence it performs the outer joins (which can produce 2 rows for the player, because the rows are only unique per villageId, which you cannot know before querying).
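As a minimal illustration of that point (this assumes the mapping from this answer, with the fetch type flipped to LAZY; it is a sketch, not tested code):

// With LAZY, the villages are not fetched as part of the BattleGroup query;
// Hibernate issues a separate select the first time the collection is accessed.
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
@JoinColumn(name = "player_id", referencedColumnName = "player_id")
private List<BattleGroupVillage> villages = new ArrayList<>();

// Usage, e.g. inside doInTransaction(...) from the test case above:
// BattleGroup battleGroup = em.find(BattleGroup.class, battleGroupId); // loads group + players
// BattleGroupPlayer player = battleGroup.getPlayers().get(0);          // no villages query yet
// int count = player.getVillages().size();                             // triggers the separate select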
If you have more than one such field, i.e. you also want to join on battleGroupId, you would need this:
@JoinColumns({
    @JoinColumn(name = "player_id", referencedColumnName = "player_id"),
    @JoinColumn(name = "battle_group_id", referencedColumnName = "battle_group_id")
})
NOTE: I used an H2 in-memory DB for the test case.
I have 3 tables that have a hierarchical relationship:
Page (Grandmother)
public class Page extends BaseDAO {
    @Id
    @GeneratedValue(strategy = IDENTITY)
    @Column(name = "page_id", unique = true, nullable = false)
    public Integer getPageId() {
        return this.pageId;
    }

    @OneToMany(fetch = FetchType.EAGER, mappedBy = "page", cascade = CascadeType.ALL, orphanRemoval = true)
    @NotFound(action = NotFoundAction.IGNORE)
    public Set<PageWell> getPageWells() {
        return this.pageWells;
    }
}
PageWell (Mother)
public class PageWell extends BaseDAO {
    @Id
    @Column(name = "page_well_id", unique = true, nullable = false)
    public int getPageWellId() {
        return this.pageWellId;
    }

    @ManyToOne(fetch = FetchType.EAGER)
    @JoinColumn(name = "page_id", nullable = false)
    public Page getPage() {
        return this.page;
    }

    @OneToMany(fetch = FetchType.EAGER, mappedBy = "pageWell", cascade = CascadeType.ALL)
    public Set<PageComponentAttribute> getPageComponentAttributes() {
        return this.pageComponentAttributes;
    }
}
PageComponentAttribute (Daughter)
public class PageComponentAttribute extends BaseDAO {
    @Id
    @GeneratedValue(strategy = IDENTITY)
    @Column(name = "page_component_attribute_id", unique = true, nullable = false)
    public Integer getPageComponentAttributeId() {
        return this.pageComponentAttributeId;
    }

    @ManyToOne(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    @JoinColumn(name = "page_well_id", nullable = false)
    public PageWell getPageWell() {
        return this.pageWell;
    }
}
The primary keys for all three tables are auto-increment in MySQL. The expected behavior is that when I save the Page, all PageWell objects get saved, and all PageComponentAttribute objects also get saved.
For some reason, it is working correctly for the Grandmother -> Daughter relationship. But in the case of the Mother -> Daughter relationship, the Daughter's foreign key is set to 0 every time. This was obviously causing a constraint violation. I have temporarily removed the FK constraint on that relationship, and the record makes it into the table, but the FK is still 0.
My save code looks like this:
Page page = getPage(request); //getPage() finds an instance of page, or creates and persists a new instance if none exists.
Set<PageWell> wells = page.getPageWells();
wells.clear(); //delete all related PageWell objects so we can re-create them from scratch
page = pageHome.merge(page);
wells = page.getPageWells();
PageWell pageWell;
// Now create a new PageWell and set up bi-directional mapping with Page. This part works great.
pageWell = new PageWell();
pageWell.setPage(page);
wells.add(pageWell);
// Now do the exact same thing with the PageComponentAttribute objects
PageComponentAttribute pca = new PageComponentAttribute();
pca.setPageWell(pageWell);
pca.getPageWell().getPageComponentAttributes().add(pca);
// Now save the Page
page = pageHome.merge(page);
When I check the database, the FK in the PageComponentAttribute table is set to 0. Again, I have temporarily removed the FK constraint from MySQL just to allow the record to save without an exception, but besides that, what am I doing wrong?
I would try one of the following things, or all of them:
1) Remove the cascade from the @ManyToOne. In general it's not a good idea to have it configured like that; it essentially only makes sense for @OneToMany and @OneToOne.
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "page_well_id", nullable = false)
public PageWell getPageWell() {
    return this.pageWell;
}
2) Try using the Hibernate cascade configuration instead of the JPA one:
@OneToMany(fetch = FetchType.EAGER, mappedBy = "pageWell")
@Cascade(CascadeType.ALL)
public Set<PageComponentAttribute> getPageComponentAttributes() {
    return this.pageComponentAttributes;
}
There may be some small differences, see: article
3) Not sure why you invoke merge twice on the page entity; I would stick to a single merge at the very end (see the sketch after this list).
4) Last workaround that comes to my mind would be performing an explicit flush here:
pageWell = new PageWell();
pageWell.setPage(page);
wells.add(pageWell);
session.flush();
and then:
PageComponentAttribute pca = new PageComponentAttribute();
pca.setPageWell(pageWell);
pca.getPageWell().getPageComponentAttributes().add(pca);
session.merge(pageWell);
In theory, pageWell should then already have its primary key generated because of the flush, and it should not be 0 anymore.
I wish I had a testing environment right now to test this properly.
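Regarding point 3, the consolidated version would look roughly like the sketch below (same objects and pageHome as in the question, only the ordering changes; it assumes getPageComponentAttributes() returns an initialized set, as the original code does):

// Build the whole object graph first, then merge once at the end.
// CascadeType.ALL on Page.pageWells and PageWell.pageComponentAttributes
// is expected to propagate the save through the graph.
Page page = getPage(request);
page.getPageWells().clear();

PageWell pageWell = new PageWell();
pageWell.setPage(page);
page.getPageWells().add(pageWell);

PageComponentAttribute pca = new PageComponentAttribute();
pca.setPageWell(pageWell);
pageWell.getPageComponentAttributes().add(pca);

page = pageHome.merge(page); // single merge at the very end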
In the unlikely chance that someone has made the same bone-headed mistake I've made, the problem was that the PageWell entity's primary key didn't have a generation strategy. I added one and it fixed my problem.
@GeneratedValue(strategy = IDENTITY)
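In context, the PageWell id mapping from the question then becomes (only the @GeneratedValue line is new):

@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "page_well_id", unique = true, nullable = false)
public int getPageWellId() {
    return this.pageWellId;
}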
I'm having a hard time understanding this JPA behavior which to me doesn't seem to follow the specification.
I have 2 basic entities:
public class User {
    @Id
    @Column(name = "id", unique = true, nullable = false, length = 36)
    @Access(AccessType.PROPERTY)
    private ID id;

    @OrderBy("sequence ASC")
    @OneToMany(fetch = FetchType.LAZY, mappedBy = "user", cascade = { CascadeType.REMOVE })
    private final Set<UserProfile> userprofiles = new HashSet<UserProfile>(0);

    // Omitting the rest of the fields since they aren't relevant
}
public class UserProfile {
    @Id
    @Column(name = "id", unique = true, nullable = false, length = 36)
    @Access(AccessType.PROPERTY)
    private ID id;

    @NotNull
    @ManyToOne(fetch = FetchType.LAZY, optional = false)
    @JoinColumn(name = "userID", nullable = false, foreignKey = @ForeignKey(name = "FK_UserProfile_User"))
    private User user;

    // Omitting the rest of the fields since they aren't relevant
}
As you can see I only have cascading set to REMOVE, the behavior will be the same if I don't have cascade set at all.
Now if I call:
User user = new User();
user.setId(UUIDGenerator.generateId());
UserProfile userProfile = new UserProfile();
userProfile.setId(UUIDGenerator.generateId());
userProfile.setUser(user);
user.getUserProfiles().add(userProfile);
em.merge(user);
merge will throw an exception.
I see Hibernate is executing a SQL query against the UserProfile table:
select userprofil0_.userProfileID as userProf1_4_0_, userprofil0_.profileID as profileI3_4_0_, userprofil0_.sequence as sequence2_4_0_, userprofil0_.userID as userID4_4_0_ from UserProfile userprofil0_ where userprofil0_.userProfileID=?
And then it will throw an exception
org.springframework.orm.jpa.JpaObjectRetrievalFailureException: Unable to find com.mytest.domain.UserProfile with id 6aaab891-872d-41e6-8362-314601324847;
Why is this query even called?
Since I don't have the cascade type set to MERGE on userprofiles, my expectation would be that JPA/Hibernate would simply ignore the entities inside the userprofiles set and only insert/update the user record; doesn't this go against the JPA spec?
If I change the cascade type to MERGE, things work as expected and both User and UserProfile are added to the database, so no problem there. What puzzles me is why Hibernate is querying the database and erroring out about an entity that is not supposed to be merged at all, since I don't have it set to cascade.
This is more of an academic scenario that I ran into; of course I could simply clear the userprofiles set and things would work, but I'm trying to understand why the above behavior happens, since I'm probably missing some crucial piece of information about how merge works. It seems it will always try to attach all entities to the session, regardless of whether the cascade type is set or not.
Why is this query even called?
It's because you are trying to merge the entity; in JPA, merge() is used to make the entity managed/attached. To "merge" User, JPA still needs to maintain the references it holds (UserProfile). In your case it is not trying to persist UserProfile, it is trying to get a reference to it in order to merge User. Read here
If you use persist rather than merge, this should not happen.
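A minimal sketch of the persist-based variant of the question's code (same entities; note that with only CascadeType.REMOVE configured, the UserProfile is not persisted by cascade and has to be persisted explicitly if it should be stored as well):

User user = new User();
user.setId(UUIDGenerator.generateId());

UserProfile userProfile = new UserProfile();
userProfile.setId(UUIDGenerator.generateId());
userProfile.setUser(user);
user.getUserProfiles().add(userProfile);

em.persist(user);        // no lookup of the detached UserProfile, unlike merge()
em.persist(userProfile); // not cascaded from User, so persist it explicitly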
I have a device table and a device_group table, mapped by a device_group_mapping table as below:
CREATE TABLE device_group_mapping
(
device_id character varying(64) NOT NULL,
device_group_id bigint NOT NULL,
CONSTRAINT "FK_device_group_mapping_device" FOREIGN KEY (device_id)
REFERENCES device (id) MATCH SIMPLE
ON UPDATE NO ACTION ON DELETE NO ACTION,
CONSTRAINT "FK_device_group_mapping_device_group" FOREIGN KEY (device_group_id)
REFERENCES device_group (id) MATCH SIMPLE
ON UPDATE NO ACTION ON DELETE NO ACTION
)
WITH (
OIDS=FALSE
);
The Device and DeviceGroup OpenJPA entities are as below:
@Entity
@Table(name = "device")
public class Device implements Serializable
{
    @ManyToMany(fetch = FetchType.LAZY)
    @JoinTable(name = "device_group_mapping",
        joinColumns = {@JoinColumn(name = "device_id", referencedColumnName = "id", nullable = false)},
        inverseJoinColumns = {@JoinColumn(name = "device_group_id", referencedColumnName = "id", nullable = false)})
    private List<DeviceGroup> deviceGroupCollection;
}

@Entity
@Table(name = "device_group")
public class DeviceGroup implements Serializable
{
    @ManyToMany(mappedBy = "deviceGroupCollection", fetch = FetchType.EAGER)
    @OrderBy()
    private List<Device> deviceCollection;
}
Because the fetch type is lazy, I currently have to load the deviceGroupCollection with code like this:
@Override
@Transactional
public List<Device> findAllDevicesWithGroupMapping() throws Exception
{
    List<Device> list = new ArrayList<Device>();
    list = this.deviceDao.findAll();
    for (Device device : list)
    {
        device.setDeviceGroupCollection(device.getDeviceGroupCollection());
    }
    return list;
}
However, this is very slow when the list contains a large number of devices.
I think I could instead load the Device entities with a JPQL query that fetch joins the device groups, but I don't know how to write it. According to the OpenJPA spec, it supports neither an on clause nor nested fetch joins.
The OpenJPA version I currently use is:
<dependency>
<groupId>org.apache.openjpa</groupId>
<artifactId>openjpa-all</artifactId>
<version>2.2.2</version>
</dependency>
Any help is appreciated.
You use a fetch join on a ManyToMany like on any other association. You don't need any on clause, since the association mapping already defines how the two entities are linked to each other:
select d from Device d
left join fetch d.deviceGroupCollection
where ...
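For completeness, a sketch of how that query could be executed with a plain EntityManager (the injected EntityManager is assumed; distinct is added so each Device appears only once in the result despite the collection join):

public List<Device> findAllDevicesWithGroupMapping()
{
    return entityManager.createQuery(
            "select distinct d from Device d left join fetch d.deviceGroupCollection",
            Device.class)
        .getResultList();
}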
I have a situation that is quite similar to the one outlined in this question's diagram: JPA. JoinTable and two JoinColumns, although with different issues.
I have three tables: Function, Group, and Location. Currently, I have a join table set up between Location and Group using @JoinTable. It is @ManyToMany on both sides, and works perfectly fine.
I am attempting to add the constraint that no Location should be associated with more than one Group that has the same Function. So I added a column for Function to my join table in my SQL schema and a uniqueness constraint across the Location and Function columns, like so:
create table function_table (
id varchar(50),
primary key(id)
);
create table group_table (
id varchar(50),
function_id varchar(50) not null,
primary key(id)
);
alter table group_table add constraint FK_TO_FUNCTION foreign key (function_id) references function_table;
create table location_table (
id varchar(50),
primary key(id)
);
create table group_location_join (
location_id varchar(50) not null,
group_id varchar(50) not null,
function_id varchar(50) not null,
primary key(location_id, group_id, function_id),
unique(location_id, function_id)
);
alter table group_location_join add constraint FK_TO_LOCATION foreign key (location_id) references location_table;
alter table group_location_join add constraint FK_TO_GROUP foreign key (group_id) references group_table;
alter table group_location_join add constraint FK_TO_FUNCTION foreign key (function_id) references function_table;
I then attempted to set up the following in my model entities:
@Entity
@Table(name = "function_table")
public class Function {
    @Id
    @Column(name = "id", length = 50)
    private String id;
}

@Entity
@Table(name = "group_table")
public class Group {
    @Id
    @Column(name = "id", length = 50)
    private String id;

    @ManyToOne
    @JoinColumn(name = "function_id", referencedColumnName = "id", nullable = false)
    private Function function;

    @ManyToMany
    @JoinTable(name = "group_location_join",
        joinColumns = {@JoinColumn(name = "group_id", referencedColumnName = "id"),
                       @JoinColumn(name = "function_id", referencedColumnName = "function_id")},
        inverseJoinColumns = @JoinColumn(name = "location_id", referencedColumnName = "id"))
    private Set<Location> locations;
}
@Entity
@Table(name = "location_table")
public class Location {
    @Id
    @Column(name = "id", length = 50)
    private String id;

    @ManyToMany
    @JoinTable(name = "group_location_join",
        joinColumns = @JoinColumn(name = "location_id", referencedColumnName = "id"),
        inverseJoinColumns = {@JoinColumn(name = "group_id", referencedColumnName = "id"),
                              @JoinColumn(name = "function_id", referencedColumnName = "function_id")})
    private Set<Group> groups;
}
(Obviously, there is more to these entities, but I stripped them down to only the parts relevant to this question.)
This does not work. When I write a simple test to create a Location associated with a Group that is associated with a Function, the minute I try to flush the session to commit the transaction, Hibernate gives me this:
java.lang.ClassCastException: my.package.Group cannot be cast to java.io.Serializable
I think what's happening is that Hibernate is getting confused, throwing up its hands, and saying "I'll just serialize it, send it to the database, and hope it knows what's going on."
When I add implements Serializable and add a serialVersionUID to Group, I then get this:
org.hibernate.exception.SQLGrammarException: user lacks privilege or object not found: FUNCTION_ID
I'm not really sure how to proceed at this point, or if perhaps I have already proceeded too far down the wrong path. Maybe I'm not thinking about the SQL correctly, and there is a much easier way to ensure this constraint that doesn't involve all this ridiculousness.
Edit: In my system, the DAOs for the tables involved have no save capabilities. Which means that as long as my constraint is set up in the database, my application doesn't care; it can't insert things that violate the constraint because it can't insert things at all.
Edit 2: I never originally solved the stated problem, and instead simply added a third column in my database schema without touching the Java code, as stated in my first Edit section above. But I have since experimented with creating an explicit join table object with an @Embedded compound key, and it seems to work.
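For anyone curious what that explicit join-table mapping can look like, here is a rough sketch (the class and field names GroupLocation / GroupLocationId are illustrative and not taken from the original code; each class would live in its own file):

// Illustrative composite key for group_location_join.
@Embeddable
public class GroupLocationId implements Serializable {
    @Column(name = "location_id") private String locationId;
    @Column(name = "group_id")    private String groupId;
    @Column(name = "function_id") private String functionId;
    // equals() and hashCode() over all three fields are required for a composite key
}

// Illustrative join entity that maps the three foreign keys onto the embedded id.
@Entity
@Table(name = "group_location_join")
public class GroupLocation {
    @EmbeddedId
    private GroupLocationId id;

    @ManyToOne
    @MapsId("locationId")
    @JoinColumn(name = "location_id")
    private Location location;

    @ManyToOne
    @MapsId("groupId")
    @JoinColumn(name = "group_id")
    private Group group;

    @ManyToOne
    @MapsId("functionId")
    @JoinColumn(name = "function_id")
    private Function function;
}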
You are trying to create a composite primary key. In Hibernate you can do it using the @Embeddable annotation. In the example below you can find the way to use a composite key for two entities.
I believe you can move forward with this example and create your own version of primary key.
Mapping ManyToMany with composite Primary key and Annotation: