Create mapped entity when you only have the id - java

I'm not sure how to phrase the question title, to be honest; if someone has a suggestion, please let me know.
My use case is this: I have an entity with an account property, like so (this is cleaned up to avoid clutter):
@Entity
@Table(name = "report_line", schema = "public")
public class ReportLine extends BaseReportLine {
@ManyToOne
@JoinColumn(name = "report_id")
private Report report;
@ManyToOne
@JoinColumn(name = "account_id")
private Account account;
}
But I have a DTO that only has an account id and different properties:
public class ImportLineDto {
public String groupName;
public Integer position;
public Integer parentPosition;
public String accountId;
public String name;
public BigDecimal amount;
public List<ImportLineDto> lines = new ArrayList<>();
}
I need to go through and flatten all the lines so I can save them to a JPA repository, but there are a few issues:
Is there a way to create the table line object using the accountId only, without having to look up the account for each line? That lookup would add a massive amount of unnecessary DB calls.
What should I do with the 'lines' on each table object after flattening? Should I set them to null / an empty list?
Is there a better way to do this? For once I can actually make changes to the code.
Here is what I have so far:
private void saveReport(ImportedResult result) {
Report report = new Report();
...
report.setLines(getReportLinesFromDtoLines(result.lineItems.lines));
reportRepository.saveAndFlush(report);
}
private List<ReportLine> getReportLinesFromDtoLines(ImportLineDto lines) {
List<ImportLineDto> flatLines = flatMapRecursive(lines).collect(Collectors.toList());
List<ReportLine> reportLines = new ArrayList<>();
for(ImportLineDto line: flatLines) {
ReportLine reportLine = new ReportLine();
reportLine.setItemText(line.name);
reportLine.setAmount(line.amount);
reportLine.setAccount(???);
// how do I set the 'Account' property using the id only, without looking up each account?
reportLines.add(reportLine);
}
return reportLines;
}
public Stream<ImportLineDto> flatMapRecursive(ImportLineDto item) {
if (item.lines == null) {
return Stream.empty();
}
return Stream.concat(Stream.of(item), item.lines.stream()
.flatMap(this::flatMapRecursive));
}
Follow up:
Just to throw a wrench in there: what if the DTO accountId were not the actual "id" field in the table, but another custom field? I have another situation like that; would it even be possible? I still need the answer to the first question with a standard id, however.

You may use entityManager.getReference, as explained here:
reportLine.setAccount(entityManager.getReference(Account.class, line.accountId));
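In context, the mapping loop could then look roughly like this (a minimal sketch; it assumes an injected EntityManager and that accountId matches the type of Account's @Id field):
for (ImportLineDto line : flatLines) {
    ReportLine reportLine = new ReportLine();
    reportLine.setItemText(line.name);
    reportLine.setAmount(line.amount);
    // getReference returns a lazy proxy, so no SELECT is issued here;
    // the foreign key is written from the proxy's id when the report is flushed.
    reportLine.setAccount(entityManager.getReference(Account.class, line.accountId));
    reportLines.add(reportLine);
}
For the follow-up (matching on a column that is not the primary key): getReference only works with the identifier. One option, if you can change the mapping, is Hibernate's @NaturalId together with session.bySimpleNaturalId(Account.class).getReference(value), though that may still need to resolve the natural id to the primary key.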

Related

Cannot use save() to insert when previously importing data using data.sql

I'm terribly sorry if I shouldn't start another post connected to my previous one, but my question is somewhat different.
I noticed that I can save new data in my database as long as I never seeded it by adding the line spring.datasource.initialization-mode=always to my application.properties and creating a data.sql file with a few insert statements. Once I insert the data using that file, I can access it and show it to the user, but I can't create any new data, because I get the following error:
ERROR: duplicate key value violates unique constraint "joke_pkey"
Detail: Key (id)=(1) already exists.
Does anyone know how to help me with this? I'm doing an interview task and I am meant to first import data using the data.sql file and then later add some more data.
The post with my code is here:
Spring Boot using save never inserts a row inside of a Postgresql table
EDIT - someone recommended adding my code here directly and saying what I've tried.
I have tried to initialize the database with the application properties the way they are, then restarting the app without the last line and with spring.jpa.hibernate.ddl-auto set to none. But even so, it didn't work, although I genuinely expected it to. If the table is empty and I fill it in using the functions I created, everything works like a charm, even after restarting the server (again with spring.jpa.hibernate.ddl-auto kept at none to keep the data from being deleted).
I have also tried simply changing the GenerationType.AUTO to GenerationType.TABLE strategy in my Joke class, but that didn't seem to change anything either.
application.properties :
spring.datasource.url=jdbc:postgresql://localhost:5432/flyway_demo
spring.datasource.username=bob
spring.datasource.password=bob123
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
spring.jpa.hibernate.ddl-auto=create
spring.datasource.initialization-mode=always
My Web Controller that has the post function:
@PostMapping("/post")
public String insertJoke(JokeForm jokeForm) {
int categoryid = jokeForm.getCategoryId();
String content = jokeForm.getContent();
databasController.insert(categoryid, content);
return "redirect:/";
}
My DBController whose insert function is being called
public Joke insert(int categoryid, String content) {
return jokeRepository.save(new Joke(categoryid, content));
}
Most of my Joke data class:
@Entity
public class Joke {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(columnDefinition = "serial")
private Long id;
@NotNull
@Column(name = "category_id_FK")
private long categoryId;
@NotBlank
private String content;
@Column(columnDefinition = "integer default 0")
private int likes = 0;
@Column(columnDefinition = "integer default 0")
private int dislikes = 0;
public Joke() {
}
public Joke(long categoryid, String content) {
this.setCategoryid(categoryid);
this.setContent(content);
}
// id
public Long getId() {
return this.id;
}
// id
public void setId(Long id) {
this.id = id;
}
// categoryid
public long getCategoryid() {
return this.categoryId;
}
public void setCategoryid(long categoryid) {
this.categoryId = categoryid;
}
// content
public String getContent() {
return this.content;
}
public void setContent(String content) {
this.content = content;
}
// likes
public int getLikes() {
return this.likes;
}
public void setLikes(int likes) {
this.likes = likes;
}
// dislikes
public int getDislikes() {
return this.dislikes;
}
public void setDislikes(int dislikes) {
this.dislikes = dislikes;
}
}
Joke Repository:
@Repository
public interface JokeRepository extends JpaRepository<Joke, Integer> {
Joke findById(long id);
List<Joke> findByCategoryid(int categoryid);
}
It seems that all you need to do is change GenerationType.AUTO to GenerationType.IDENTITY.
The reason is the sequence, which might be out of sync if you use AUTO, because Hibernate then uses its own sequence instead of the one PostgreSQL creates for the serial column.
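For illustration, the id mapping in the Joke entity would then look something like this (a sketch based on the entity above):
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(columnDefinition = "serial")
// IDENTITY lets the PostgreSQL serial column hand out the key,
// instead of a Hibernate-managed sequence that can get out of sync.
private Long id;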

DynamoDB versioning with DynamoDB mapper is not working as expected

I am getting a ConditionalCheckFailed exception when trying to save/update items using DynamoDB mapper.
Can anyone please share a snippet of Java code that demonstrates how versioning and optimistic locking can be implemented successfully?
Tried not setting the version at all.
Tried adding a record to the table, and then doing a read before save.
Nothing worked! I continue to get the ConditionalCheckFailed exception.
The only thing that works is if I set the save behavior config to CLOBBER, but that's not what I want, as I need optimistic locking for my data.
DB item class---
@DynamoDBTable(tableName="Funds")
public class FundsItem {
private String id;
private String auditId;
private Long version;
private String shopId;
private String terminalId;
private String txId;
@DynamoDBHashKey(attributeName = "Id")
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
@DynamoDBRangeKey(attributeName = "AuditId")
public String getAuditId() {
return auditId;
}
public void setAuditId(String auditId) {
this.auditId = auditId;
}
@DynamoDBVersionAttribute(attributeName = "Version")
public Long getVersion() { return version; }
public void setVersion(Long version) { this.version = version; }
#DynamoDBAttribute(attributeName = "ShopId")
public String getShopId() {
return shopId;
}
public void setShopId(String shopId) {
this.shopId = shopId;
}
#DynamoDBAttribute(attributeName = "TerminalId")
public String getTerminalId() { return terminalId; }
public void setTerminalId(String terminalId) {
this.terminalId = terminalId;
}
#DynamoDBAttribute(attributeName = "TxId")
public String getTxId() {
return txId;
}
public void setTxId(String txId) {
this.txId = txId;
}
}
Code to save new item -----
public void addFunds(FundsRequest request) {
FundsItem dbItem = new FundsItem();
String Id = request.getShopId().trim() + request.getTerminalId().trim();
String V0_Audit_Rec = "V0_Audit_" + Id;
//save V0 item.
dbItem.setVersion((long) 1);
dbItem.setId(Id);
dbItem.setAuditId(V0_Audit_Rec);
dbItem.setShopId(request.getShopId().trim());
dbItem.setTerminalId(request.getTerminalId().trim());
dbItem.setTxId(request.getTxId().trim());
mapper.save(dbItem);
}
Please check the snippet above - this is a new, empty table.
Hash key - id, range key - auditId, version field - version.
I just want to be able to add a new record, which is why I'm not doing any read before saving a new item. If I can get this simple case (adding a new/first record to the DynamoDB table) to work, I can implement the rest of the use cases too.
In general:
Never set your version, the SDK will initialise this if required.
Always try and load an item with your key first. If null is returned, create the item and save it. Else update the returned item and save it.
I know you mentioned you've tried the above. If it's truly an empty table, your code should work OK (minus the setting of the version).
A couple of things I would also do:
Don't set your version field with a custom attribute name. In theory this should be fine, but for the sake of making your code the same as the AWS examples, I would remove this, at least until you have it working.
Although I think you need to remove the setting of the version, I note you are casting to a long, not a Long. Again, this is unlikely to be an issue, but it's something to eliminate at least; i.e. if you insist on setting the version, use new Long(l).
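As a rough sketch of that load-then-save pattern with the FundsItem above (the method and variable names here are illustrative only; it assumes the same mapper instance):
public void addOrUpdateFunds(FundsRequest request) {
    String id = request.getShopId().trim() + request.getTerminalId().trim();
    String auditId = "V0_Audit_" + id;
    // Read first: load() returns null if the item does not exist yet.
    FundsItem item = mapper.load(FundsItem.class, id, auditId);
    if (item == null) {
        item = new FundsItem();
        item.setId(id);
        item.setAuditId(auditId);
        // Do not set the version; the mapper initialises it on the first save.
    }
    item.setShopId(request.getShopId().trim());
    item.setTerminalId(request.getTerminalId().trim());
    item.setTxId(request.getTxId().trim());
    // On save, the mapper adds a condition on the stored version and increments it.
    mapper.save(item);
}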

ObjectBox: Get objects with specific relations

Consider the two entities Movie and Genre:
@Entity
public class Movie {
@Id
private long id;
private String name;
private ToMany<Genre> genres;
[...]
}
@Entity
public class Genre {
@Id
private long id;
private String name;
[...]
}
We all know how to create a relation and save it:
Movie movie = new Movie();
movie.setName("Star Wars");
movie.getGenres().add(new Genre("Sci-Fi"));
box.put(movie);
but is there a possibility to query all Movie-objects with a specific Genre? Like
Box<Movie> box = boxStore.boxFor(Movie.class);
Query query = box.query()
.equal(Genre_.name, "Sci-Fi") // note that I want to query the Movie-Box with Genre-properties
.build();
List movies = query.find();
My goal is to find all movies with a specific genre in a simple way. Does anyone know how to do it or do I have to query all movies and filter the result on my own? Or do I have to adapt my entities in another way?
Update:
I prepared the correct marked answer below to a working example:
final Genre genreSciFi = genreBox.query().equal(Genre_.name, "Sci-Fi").build().findFirst();
List<Movie> filteredMovies = movieBox.query().filter(new QueryFilter<Movie>() {
@Override
public boolean keep(@NonNull Movie entity) {
return entity.getGenres().contains(genreSciFi);
}
}).build().find();
To make the contains method work correctly, override the equals method in your Genre entity:
@Override
public boolean equals(Object obj) {
return obj instanceof Genre && ((Genre) obj).getId() == id && ((Genre) obj).getName().equals(name);
}
Unfortunately, this part of the API is not exposed in Java yet. We want to refactor the Query API very soon.
Until this is ready, you can work around it using query filtering. Example using Java/Kotlin-ish code for brevity:
Query query = movieBox.query().filter(movie -> {
    return genres.contains(genre -> {
        return "Sci-Fi".equals(genre.getName());
    });
}).build();
(Will make it similar in Java with the next update.)
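For reference, a compilable plain-Java version of that filter could look like this (a sketch; it walks the genres relation of each candidate Movie):
List<Movie> sciFiMovies = movieBox.query().filter(movie -> {
    // keep the movie if any of its genres is named "Sci-Fi"
    for (Genre genre : movie.getGenres()) {
        if ("Sci-Fi".equals(genre.getName())) {
            return true;
        }
    }
    return false;
}).build().find();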

How to persist classes like java.util.Currency?

I am using hibernate to persist my data. It's a financial application and I am having a hard time persisting the most fundamental entity of the application which is 'Money'. I was using JodaMoney but it's immutable so I am not able to find a good way to persist it. And without persisting my money to database, there is no point of making the application. What would I do with the immutability when I can't even store the state of my object?
Then I started creating my own 'Money' (fields: BigDecimal for the amount and java.util.Currency for the currency), as I wanted to use java.util.Currency. But that class doesn't have a public constructor, so Hibernate cannot persist it.
Please guide me on how to deal with this?
EDIT1: The code for most basic class:
@Entity
public class BasicMoney {
@Embedded
@Id
private BigDecimal amount;
@Embedded
private Currency currency;
//getters and setters, other methods
}
Now, when I make an object of this class and try to store it in the database, it doesn't work. Hibernate throws:
org.hibernate.InstantiationException: No default constructor for entity: : java.util.Currency
So, this is the problem which I am facing.
You could use a workaround:
@Embedded
String currency; // store the currency as its code string
public void setCurrency(Currency currency) {
this.currency = currency.getCurrencyCode(); // convert the actual Currency to its code
}
public Currency getCurrency() {
return Currency.getInstance(currency); // convert back to an actual Currency
}
Instead of dealing with a currency string and an amount, teach Hibernate about your type instead. That way you would have:
private Money amount;
instead of
private BigDecimal amount;
private String currency;
so you don't need to convert all over the place.
Here's how I do it:
1) Instead of JodaMoney, use JavaMoney, the JSR-354 project that is expected to be included in Java 9. If you want to stick to JodaMoney, you don't need step #3 below.
2) Add this UserType library to your classpath
3) Create this simple class:
public class CustomPersistentMoneyAmountAndCurrency extends AbstractMultiColumnUserType<MonetaryAmount> {
private static final ColumnMapper<?, ?>[] COLUMN_MAPPERS = new ColumnMapper<?, ?>[] { new CustomStringColumnCurrencyUnitMapper(), new BigDecimalBigDecimalColumnMapper() };
private static final String[] PROPERTY_NAMES = new String[]{ "currency", "number" };
@Override
protected ColumnMapper<?, ?>[] getColumnMappers() {
return COLUMN_MAPPERS;
}
@Override
protected Money fromConvertedColumns(Object[] convertedColumns) {
CurrencyUnit currencyUnitPart = (CurrencyUnit) convertedColumns[0];
BigDecimal amountPart = (BigDecimal) convertedColumns[1];
return Money.of(amountPart, currencyUnitPart);
}
@Override
protected Object[] toConvertedColumns(MonetaryAmount value) {
return new Object[] { value.getCurrency(), value.getNumber().numberValue(BigDecimal.class) };
}
@Override
public String[] getPropertyNames() {
return PROPERTY_NAMES;
}
}
4) Now wherever you want to use this in your Entities you would do:
@TypeDefs(value = {
@TypeDef(name = "moneyAmountWithCurrencyType", typeClass = CustomPersistentMoneyAmountAndCurrency.class)
})
@Entity
@Table(name = "account_entry")
public class AccountEntry {
private Money referenceMoney;
...
@Basic( optional = false )
@Columns(columns = {
@Column( name = "reference_money_currency", nullable = false, length = 3 ),
@Column( name = "reference_money", nullable = false )
})
@Type(type = "moneyAmountWithCurrencyType")
public Money getReferenceMoney() {
return this.referenceMoney;
}
}
And that's it. You will get strong typing all throughout.
Hibernate does support java.util.Currency natively.
https://docs.jboss.org/hibernate/orm/3.6/reference/en-US/html/types.html#types-basic-value-currency
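With that in place, a field of type java.util.Currency can be mapped directly; a minimal sketch adapting the BasicMoney example above (with a surrogate id, since a BigDecimal amount as @Id is problematic anyway):
@Entity
public class BasicMoney {
    @Id
    @GeneratedValue
    private Long id;
    private BigDecimal amount;
    // Mapped by Hibernate's built-in currency type, stored as the ISO 4217 code.
    private Currency currency;
    // getters and setters
}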
Sorry for the late answer, but I didn't see this option among the answers and I think it is pretty elegant. You can use a JPA converter with autoApply:
@Converter(autoApply = true)
public class CurrencyConverter implements AttributeConverter<Currency, String> {
@Override
public String convertToDatabaseColumn(Currency currency) {
return currency.getCurrencyCode();
}
@Override
public Currency convertToEntityAttribute(String currencyCode) {
return Currency.getInstance(currencyCode);
}
}
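Because of autoApply = true, the converter is then picked up for every Currency attribute without any further annotations, so an entity could simply declare something like:
private BigDecimal amount;
private Currency currency; // persisted as its ISO 4217 code via CurrencyConverter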

ModelMapper integration with Jooq Record

===== POJO =====
// Employee POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Employee implements Serializable {
private Integer id;
private String name;
private Integer companyId;
// assume getters ,setters and serializable implementations.
}
// Company POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Company implements Serializable {
private Integer id;
private String name;
// assume getters ,setters and serializable implementations.
}
// EmployeeVO POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class EmployeeVO implements Serializable {
private Employee employee;
private Company company;
// assume getters ,setters and serializable implementations.
}
===== My DAO layer class =====
public List<EmployeeVO> getEmployees(){
// configuring model mapper.
ModelMapper modelMapper = new ModelMapper();
modelMapper.getConfiguration()
.addValueReader(new RecordValueReader())
.setSourceNameTokenizer(NameTokenizers.UNDERSCORE);
//property map configuration.
PropertyMap<Record, EmployeeVO> employeeVOMap = new PropertyMap<Record, EmployeeVO>() {
protected void configure() {
map().getEmployee().setName(this.<String>source("name"));
map().getEmployee().setId(this.<Integer>source("id"));
map().getCompany().setName(this.<String>source("comp_name"));
map().getCompany().setId(this.<Integer>source("comp_id"));
}
};
// TypeMap config
modelMapper.createTypeMap(Record.class, EmployeeVO.class);
// adding employeeVOMap .
modelMapper.addMappings(employeeVOMap);
// JOOQ query
List<Field<?>> fields = Lists.newArrayList();
// fields includes, id, name, comp_name, comp_id
SelectJoinStep query = select(dslContext, fields).from(EMPLOYEE)
.join(COMPANY)
.on(COMPANY.ID.equal(EMPLOYEE.COMPANY_ID));
Result<Record> records = query.fetch();
Record record = null;
Iterator<Record> it = records.iterator();
List<EmployeeVO> employeeList= Lists.newArrayList();
while (it.hasNext()) {
record = it.next();
EmployeeVO employeeVOObj =
modelMapper.map(record, EmployeeVO.class);
employeeList.add(employeeVOObj);
}
return employeeList;
}
===== Error log =====
1) Error mapping org.jooq.impl.RecordImpl to com.myportal.bingo.db.model.EmployeeVO
1 error] with root cause
java.lang.ArrayIndexOutOfBoundsException: -1
Note:
ModelMapper throws the above exception when it reaches the method below:
private void matchSource(TypeInfo<?> sourceTypeInfo, Mutator destinationMutator)
in ImplicitMappingBuilder.java
sourceTypeInfo.getAccessors() is null.
Any help?
I had the same problem, or at least one that looked the same. (You can skip directly to my solution in the last paragraph.) Lots of debugging showed the following:
If the accessors on that line (mentioned in your question) are null, then the line accessors = PropertyInfoSetResolver.resolveAccessors(source, type, configuration) in the TypeInfoImpl class is executed, and the reason for the exception in my case was this call:
valueReader.get(source, memberName), in the following piece of code in the 'resolveAccessors' method of the PropertyInfoSetResolver class:
if (valueReader == null)
resolveProperties(type, true, configuration, accessors);
else {
NameTransformer nameTransformer = configuration.getSourceNameTransformer();
for (String memberName : valueReader.memberNames(source))
accessors.put(nameTransformer.transform(memberName, NameableType.GENERIC),
new ValueReaderPropertyInfo(valueReader, valueReader.get(source, memberName),
memberName));
which ends up in source.getValue(memberName.toUpperCase()), where source is jOOQ's Record (InvoiceRecord in my case). And, tada, for some reason invoice.getValue("INVOICE_ID") ends up in the exception (there is no such field, so indexOf returns -1, which causes the ArrayIndexOutOfBoundsException), while invoice.getValue("invoice_id") is totally fine.
So the else branch (in the same piece of code above) wasn't the right path through the code, while the if case turned out to be OK.
So this is what helped me in my particular case: removing the line modelMapper.getConfiguration().addValueReader(new RecordValueReader()). I hope this will help you too.
