Unique constraint violated - shouldn't be possible, but still... why? - java

I have a model class:
public class LimitsModel {
private Long id;
private Long userId;
private Long channel;
}
I also have a unique constraint on my entity, set on the fields userId and channel. Throughout the application, there's no chance those could duplicate.
The limits were added mid-development, so we already had users and channels and had to create a limits entity for every existing user. So we're creating them during some operation, and there's no other place they're created. Here's how we create them:
List<LimitsModel> limits = userDAO.getUserLimits(userId, channel);
if(isNull(limits) || limits.isEmpty()){
List<LimitsModel> limitsToSave = this.prepareLimits();
limits = userDAO.createOrUpdateLimits(limitsToSave);
}
.
.
.
other operations
What I'm getting is
Caused by: java.sql.SQLIntegrityConstraintViolationException: ORA-00001: unique constraint (USER_LIMITS_UNIQUE) violated
Any clues what could be the cause? I'm simply fetching the limits from the database, checking if they exist and, if not, creating them. Where's the room for a unique constraint violation?
EDIT
createOrUpdateLimits just calls this method:
public void createOrUpdateAll(final Collection<?> objects) {
getHibernateTemplate().executeWithNativeSession(session -> {
Iterator<?> iterator = objects.iterator();
while (iterator.hasNext()) {
session.saveOrUpdate(iterator.next());
}
return null;
});
}
prepareLimits is nothing complicated, a simple builder:
private List<LimitsModel> prepareLimits() {
List<LimitsModel> limitsToSave = LimitsModel.CHANNELS.stream()
.map(channel -> LimitsModel.builder()
.userId(UserUtils.getId())
.channel(channel)
.build())
.collect(Collectors.toList());
return limitsToSave;
}
getUserLimits:
public List<LimitsModel> getUserLimits(Long userId, Long channel) {
return getHibernateTemplate().execute(session -> {
final Criteria criteria = session.createCriteria(LimitsModel.class)
.add(Restrictions.eq(LimitsModel.PROPERTY_USER_ID, userId));
if (nonNull(channel)){
criteria.add(Restrictions.eq(LimitsModel.PROPERTY_CHANNEL, channel));
}
return criteria.list();
});
}
The constraint is on userId, channel. There is a possibility that the block that gets the limits and then creates them is called twice. Shouldn't the new limits already be in the database when it's called the second time? Isn't the transaction already committed?
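If that block can run twice concurrently, the "select, check, insert" sequence is not atomic: both executions can see an empty result before either commits, and the second insert then hits the constraint. A plain-Java sketch of the race (all names hypothetical, a ConcurrentHashMap standing in for the USER_LIMITS table) and an atomic alternative:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CheckThenInsertDemo {
    // stand-in for the USER_LIMITS table; key = "userId:channel"
    static final Map<String, String> table = new ConcurrentHashMap<>();

    // mirrors the code in the question: between containsKey() and put(),
    // another thread/transaction can insert the same key -> ORA-00001 in the real DB
    static void checkThenInsert(String key, String limits) {
        if (!table.containsKey(key)) {
            table.put(key, limits);
        }
    }

    // atomic alternative: exactly one writer wins, everyone else sees its row
    static String insertIfAbsent(String key, String limits) {
        String previous = table.putIfAbsent(key, limits);
        return previous == null ? limits : previous;
    }

    public static void main(String[] args) {
        insertIfAbsent("42:1", "limits-a");
        insertIfAbsent("42:1", "limits-b"); // ignored: row already exists
        System.out.println(table.get("42:1")); // prints limits-a
    }
}
```

In SQL terms the same idea is a single atomic statement (e.g. MERGE, or an insert guarded by the unique constraint with the violation caught and followed by a re-read) rather than two separate statements.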


How to calculate a value into its own column without having to call the database for each record

I am trying to load data to the grid from another database table.
My grid is created from User entity
private Grid<User> grid = new Grid<>(User.class, false);
User has several columns like
private Long userId;
private String username;
private String email;
.
.
.
The User contains the record identifier from the second table (entity) too:
private Long organizationId;
So I added columns to the grid:
grid.addColumn(User::getUserId).setAutoWidth(true).setHeader("id");
grid.addColumn(User::getUsername).setAutoWidth(true).setHeader("login");
And I created my "own" column, which uses data from another table (entity):
grid.addColumn(user -> organizationService.findOrganizationNameByOrganizationId(user.getOrganizationId())).setAutoWidth(true).setHeader("organization name");
But the problem is that loading is very slow, because the User table has about 500,000 rows and a query is sent to the database for each row ...
Is it possible to load the organization name by the organization id defined in the user entity in a batch, e.g.?
First of all, I'd not bother with enriching rows that way if
I don't have to. If the solution is as simple as just joining the data
in the DB, creating a view, ... then just do that.
Yet there are times this has to be done efficiently, because the data
to enrich does not come from the same source.
I think the best place to do that is as part of the lazy loading of the
data. Once you hold the stream for the currently loaded page,
you can map over the stream and enrich the data, or do a batch load,
etc. Pick whatever is most efficient.
If you have to do that a lot, make sure to provide useful tools inside
your repositories; for quickly adding things just for one grid, you can
as well hijack the DataProvider or its successor.
Following is an example (note the XXX):
@Route("")
class GridView extends Div {
GridView() {
add(new Grid<User>(User).tap {
setItems({ q ->
// XXX: this materializes the stream first to get all
// unique organizationId:s, batch-load the
// Organization, and finally transform the User
def users = FakeUserRepo.page(q.offset, q.limit).toList()
def orgaMap = FakeOrganizationRepo.batchLoad(users*.organizationId.toSet())
users.stream().map {
it.organizationName = orgaMap[it.organizationId].name; it
}
}, { q ->
FakeUserRepo.count()
})
})
}
}
@TupleConstructor
class Organization {
Integer id
String name
}
class FakeOrganizationRepo {
public static final Map<Integer, Organization> ORGANIZATIONS = (0..<5).collectEntries { [it, new Organization(it, "Organization $it")] }
static Map<Integer, Organization> batchLoad(Set<Integer> ids) {
ORGANIZATIONS.subMap(ids)
}
}
@TupleConstructor
class User {
Integer id
String name
Integer organizationId
String organizationName
}
class FakeUserRepo {
public static final Collection<User> USERS = (1..500_000).collect {
new User(it, "User $it", it % FakeOrganizationRepo.ORGANIZATIONS.size())
}
static int count() {
USERS.size()
}
static Stream<User> page(int offset, int limit) {
USERS.stream().skip(offset).limit(limit)
}
}
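The same materialize / collect-ids / batch-load / enrich shape in plain Java, independent of Vaadin and Groovy (the repository methods are hypothetical stand-ins):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class BatchEnrichDemo {
    record Organization(int id, String name) {}
    record User(int id, String name, int organizationId, String organizationName) {}

    // stand-in for a repository method that issues ONE query for all ids
    static Map<Integer, Organization> batchLoadOrganizations(Set<Integer> ids) {
        return ids.stream()
                .collect(Collectors.toMap(id -> id,
                        id -> new Organization(id, "Organization " + id)));
    }

    // enrich one loaded page: 1) collect the distinct foreign keys,
    // 2) load them in a single round-trip, 3) map the page over the result
    static List<User> enrich(List<User> page) {
        Set<Integer> orgIds = page.stream()
                .map(User::organizationId)
                .collect(Collectors.toSet());
        Map<Integer, Organization> orgs = batchLoadOrganizations(orgIds);
        return page.stream()
                .map(u -> new User(u.id(), u.name(), u.organizationId(),
                        orgs.get(u.organizationId()).name()))
                .toList();
    }

    public static void main(String[] args) {
        List<User> page = List.of(
                new User(1, "User 1", 0, null),
                new User(2, "User 2", 1, null));
        System.out.println(enrich(page).get(0).organizationName()); // Organization 0
    }
}
```

Whatever the page size is, this issues one organization query per page instead of one per row.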

How do I fetch OneToMany mappedBy when creating the child entity before the parent?

I have a model like this:
class Message {
@Id
private UUID id;
// ...
@OneToMany(mappedBy = "messageId")
private List<Value> values;
}
class Value {
private UUID messageId;
}
The Value entities are being created in a JPA session, then, in another session, I create a Message in which I provide the id myself (which matches the messageId of existing Value).
After I have persisted the Message, when I try to call getValues() from it, I get null. What's the best way to solve this? Can I programmatically fetch the relation? Should I open another session?
Solution 1: explicitly initialize child entities during parent entity creation
Main idea of that solution is to create an additional method for loading Value entities by messageId in ValueRepository and use it explicitly to initialize values collection during Message entity creation.
Repository for loading Value entities:
public interface ValueRepository extends JpaRepository<ValueEntity, Long> {
@Query("SELECT v FROM ValueEntity v WHERE v.messageId = :messageId")
List<ValueEntity> findByMessageId(Long messageId);
}
Message creation and values collection initialization:
public Message createMessage() {
Message message = new Message();
message.setId(1L);
message.setValues(valueRepository.findByMessageId(message.getId()));
entityManager.persist(message);
return message;
}
Solution 2: perform Flush and Refresh
After the Message persist operation you can perform a flush, which will synchronize the entity with the database state, and then a refresh, which will re-read the state of the given entity.
public Message createMessage() {
Message message = new Message();
message.setId(1L);
entityManager.persist(message);
entityManager.flush();
entityManager.refresh(message);
return message;
}
I think Solution 1 is preferable; it is better from a performance perspective, because the flush operation can take additional time.
UPDATE:
In the case of a merge operation, use the returned persisted entity for the refresh, instead of the initial object.
public Message createMessage() {
Message message = new Message();
message.setId(1L);
Message persistedMessage = entityManager.merge(message);
entityManager.flush();
entityManager.refresh(persistedMessage);
return persistedMessage;
}
Or, better, divide the save and update operations:
public Message saveOrUpdateMessage() {
Message message = entityManager.find(Message.class, 1L);
if (message == null) {
//handle new entity save
message = new Message();
message.setId(1L);
entityManager.persist(message);
entityManager.flush();
entityManager.refresh(message);
}
//handle existing entity update
ValueEntity valueEntity = new ValueEntity();
valueEntity.setId(2L);
valueEntity.setMessageId(message.getId());
entityManager.persist(valueEntity);
message.getValues().add(valueEntity);
return message;
}

Spring Data JPA save entity with UNIQUE INDEX (two columns, not PK) violated doesn't throw Exception

My Java code looks like this:
public class Monitorlog{
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
int id;//pk
String sn; // unique constraints(sn,checkpoint)
String checkpoint;
}
public interface MonitorlogDao extends JpaRepository<Monitorlog, Integer>{}
@Service
public class MonitorlogService{
@Autowired
MonitorlogDao monitorlogDao;
public MonitorlogDao getMonitorlogDao(){
return monitorlogDao;
}
}
The test is like this:
public void testMonitorlogService(){
Monitorlog m = new Monitorlog();
m.setSn("aa");
m.setCheckpoint("bb");
monitorlogService.getMonitorlogDao().save(m);
m = new Monitorlog();
m.setSn("aa");
m.setCheckpoint("bb");
// SQL insert failed here
try{
monitorlogService.getMonitorlogDao().save(m);
}catch(Exception e){
log("",e);
}
}
The second save(m) fails. I would expect it to throw an exception, but it doesn't.
m.getId() is 0, but no exception is caught. Why?
I use Spring Boot ver 2.0.0M.
UNIQUE INDEX
CREATE UNIQUE INDEX [IND_IDNO_POSITIONKIND_CHECKPOINT_1] ON
[dbo].[MONITORLOG] ([CHECKPOINT] DESC, [SN] ASC)
WITH (IGNORE_DUP_KEY = ON)
The index is created WITH (IGNORE_DUP_KEY = ON). With that option, SQL Server silently discards rows that would violate the unique index and only issues the warning "Duplicate key was ignored"; the INSERT itself succeeds from the driver's point of view, so no exception is raised and no generated key is returned (which is why m.getId() stays 0). If you want duplicate inserts to fail, recreate the index with IGNORE_DUP_KEY = OFF; otherwise you always have to supply data that complies with the PK and/or unique-key rules yourself.

How to store objects in a HashMap with an incrementing id as key

I am working on a program. I was storing Users in an ArrayList, so I had a UserCollection class acting as the storage class for the Users. But since the UserCollection is thought of as a 'database', each user entry in it should have a unique id. Initially I had a userID field in the User class, but now I'm trying to handle the id part in the UserCollection. If I were to use a HashMap, where the key is the id and the value is the User, how would I go about incrementing the id so that every time a new User is stored into the map, the key keeps counting up from 1 to n users? I'm also using CRUD methods to store/remove/update the Users.
public class UserCollection{
Map<Integer, User> userMap = new HashMap<Integer,User>();
public User create(User user){
userMap.put(??,user) // not sure how to iterate the id or what to put in it
return user;
}
public User read(Integer keyID){
if(userMap.containsKey(keyID)){
return userMap.get(keyID); //something definitely wrong
}
}
//Other remaining CRUD methods after.
}
Originally I just had an ArrayList which held Users. But because in a database Users have unique ids, now I'm confused how I would handle them. If I handle them in the HashMap, do I still need a userID field in the User class?
You have asked a couple of questions here. I'll take each of them in turn:
How can I ensure that each user has a unique ID?
The simplest way to do this is to have a static field that keeps track of the largest generated id:
class User {
private static int largestID = 0;
private final int id = largestID++;
public int getID() {
return id;
}
...
}
This works but has plenty of problems with it. As soon as you store users & restart the programme, or want to reuse ids, it needs changing.
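One more caveat with the static counter: `largestID++` is not thread-safe. If users can be created from several threads, an AtomicInteger gives the same sequential ids without locking (still subject to the restart/reuse problems just mentioned):

```java
import java.util.concurrent.atomic.AtomicInteger;

class User {
    private static final AtomicInteger largestID = new AtomicInteger(0);
    // getAndIncrement() is atomic, so concurrent constructors never share an id
    private final int id = largestID.getAndIncrement();

    public int getID() {
        return id;
    }
}
```

The first `new User()` gets id 0, the next id 1, and so on, matching the post-increment behaviour of the snippet above.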
Another approach is to just find the largest id for existing customers from your collection. The following sample code uses Java 8:
class UserCollection {
private final Map<Integer,User> users = new HashMap<>();
public int nextID() {
return users.keySet().stream()
.mapToInt(n -> n).max().orElse(-1) + 1;
}
}
This is inefficient but probably good enough for a lot of applications.
Do I need to store the ID in the User class?
You have two options. Either you store it in the class and provide a mechanism for getting the ID (as above). Or you ensure that everything that deals with users stores and returns IDs, not references to objects. This allows the client of the method to then access the User using the ID.
Both of these are legitimate solutions. If you use the first (storing the ID in the class) then you should change your method for adding a User to your collection:
class UserCollection {
public void addUser(User user) {
users.put(user.getID(), user);
}
}
If you use the second, the ID field must be final because clients are relying on it not changing.
You could do the following to ensure that your Users have a unique ID for the life of the class.
public class UserCollection
{
private static int id = 0;
private static final Object lock = new Object();
private Map<Integer, User> userMap = new HashMap<Integer,User>();
public User create (User user)
{
// only add one user at a time to the map
synchronized (lock)
{
id++;
userMap.put(id, user);
}
return user;
}
// Rest of the class
}

Problem in managing bi-directional relationships in JPA: removing from a collection

My domain model has a self-referencing bi-directional relationship with relationship management done in the entity:
@Entity
public class User implements BaseEntity<String>, Serializable {
@Id
private String username;
@ManyToMany(cascade = {CascadeType.REFRESH, CascadeType.MERGE, CascadeType.PERSIST})
private List<User> associatedSenders;
@ManyToMany(mappedBy = "associatedSenders")
private List<User> associatedReceivers;
//
// Associated Senders
//
public List<User> getAssociatedSenders() {
if (associatedSenders == null) {
associatedSenders = new ArrayList<User>();
}
return associatedSenders;
}
public void addAssociatedSender(User sender) {
if (associatedSenders == null) {
associatedSenders = new ArrayList<User>();
}
associatedSenders.add(checkNotNull(sender));
if (!sender.getAssociatedReceivers().contains(this)) {
sender.addAssociatedReceiver(this);
}
}
public void removeAssociatedSender(User sender) {
if (associatedSenders == null) {
associatedSenders = new ArrayList<User>();
}
associatedSenders.remove(checkNotNull(sender));
if (sender.getAssociatedReceivers().contains(this)) {
sender.removeAssociatedReceiver(this);
}
}
public void setAssociatedSenders(List<User> senders) {
checkNotNull(senders);
if (associatedSenders == null) {
associatedSenders = new ArrayList<User>();
}
// first remove all previous senders
for (Iterator<User> it = associatedSenders.iterator(); it.hasNext();) {
User sender = it.next();
it.remove();
if (sender.getAssociatedReceivers().contains(this)) {
sender.removeAssociatedReceiver(this);
}
}
// now add new senders
for (User sender : senders) {
addAssociatedSender(sender);
}
}
//
// Associated Receivers
//
public List<User> getAssociatedReceivers() {
if (associatedReceivers == null) {
associatedReceivers = new ArrayList<User>();
}
return associatedReceivers;
}
/**
* <p><b>Note:</b> this method should not be used by clients, because it
* does not manage the inverse side of the JPA relationship. Instead, use
* the appropriate method at the inverse of the relationship.
*
@param receiver
*/
protected void addAssociatedReceiver(User receiver) {
if (associatedReceivers == null) {
associatedReceivers = new ArrayList<User>();
}
associatedReceivers.add(checkNotNull(receiver));
}
/**
* <p><b>Note:</b> this method should not be used by clients, because it
* does not manage the inverse side of the JPA relationship. Instead, use
* the appropriate method at the inverse of the relationship.
*
@param receiver
*/
protected void removeAssociatedReceiver(User receiver) {
if (associatedReceivers == null) {
associatedReceivers = new ArrayList<User>();
}
associatedReceivers.remove(checkNotNull(receiver));
}
}
When I add new user entities to the associatedSenders collection, everything works as expected. The table in the db gets updated correctly, and the in-memory relationships are correct as well. However, when I remove a user entity from the associatedSenders collection (or all entities from that collection), e.g. by doing a call like this:
List<User> senders = Collections.emptyList();
user.setAssociatedSenders(senders)
the database table gets updated correctly, but the next call to em.find(User.class, username), where username is the id of the user who previously was in the associatedSenders collection, reveals that the associatedReceivers collection (the inverse side) has not been correctly updated. That is, user is still in that collection. Only if I refresh the entity via em.refresh(), the collection is correctly updated. Looks like the entity manager does some caching here, but this behavior seems incorrect to me.
UPDATE Probably it's worth mentioning that I'm modifying the user entity in the frontend within a JSF managed bean, i.e. while the entity is in the detached state.
If you are modifying the object while it is detached, then you must merge() it back into the persistence unit. Since you are modifying the source and target objects, you must merge() both of the objects to maintain both sides of the relationship. Cascade merge is not enough as you have removed the objects, so there is nothing to cascade to.
You could also check the state of your objects after the merge, and before and after the commit.
Perhaps include your merge code.
The only explanation I can figure out is that the user object you set the empty list on is not the original cached object; it is just a copy ...
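Both answers hinge on the persistence context handing back the same in-memory instance for a given id. A plain-Java sketch (all names hypothetical) of why em.find() can return a stale inverse collection while em.refresh() repairs it:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FirstLevelCacheDemo {
    // the "database": receiver lists as stored on disk
    static final Map<String, List<String>> database = new HashMap<>();
    // the first-level cache: entities already loaded in this session
    static final Map<String, List<String>> cache = new HashMap<>();

    // like em.find(): prefer the cached in-memory instance over a re-read
    static List<String> find(String id) {
        return cache.computeIfAbsent(id,
                k -> new ArrayList<>(database.getOrDefault(k, List.of())));
    }

    // like em.refresh(): throw the cached state away and re-read from the database
    static List<String> refresh(String id) {
        List<String> fresh = new ArrayList<>(database.getOrDefault(id, List.of()));
        cache.put(id, fresh);
        return fresh;
    }

    public static void main(String[] args) {
        database.put("alice", new ArrayList<>(List.of("bob")));
        find("alice");                         // instance is now cached
        database.get("alice").remove("bob");   // db changes, cached instance does not
        System.out.println(find("alice"));     // [bob]  <- stale inverse side
        System.out.println(refresh("alice"));  // []     <- reloaded
    }
}
```

This is why merging the detached object matters: merge() updates the cached instance in memory so both sides agree, whereas without it only refresh() (a re-read) makes the inverse collection current.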
