I have a problem with jOOQ when reading data inside a Spring Boot transaction.
I save the base data within a transaction and then try to read it back with jOOQ, but the result comes back empty.
String sql = dslContext.select().from(Tables.SALES_INVENTORY).getSQL();
System.out.println(sql);

Result<Record> fetch1 = dslContext.select().from(Tables.SALES_INVENTORY).fetch();
System.out.println(fetch1);

String groupbySql =
    dslContext
        .select(Tables.SALES_INVENTORY.ITEM_ID, sum(Tables.SALES_INVENTORY.ON_HAND_QTY))
        .from(Tables.SALES_INVENTORY)
        .groupBy(Tables.SALES_INVENTORY.ITEM_ID)
        .getSQL();
System.out.println(groupbySql);

Result<Record2<UUID, BigDecimal>> fetch =
    dslContext
        .select(Tables.SALES_INVENTORY.ITEM_ID, sum(Tables.SALES_INVENTORY.ON_HAND_QTY))
        .from(Tables.SALES_INVENTORY)
        .groupBy(Tables.SALES_INVENTORY.ITEM_ID)
        .fetch();
System.out.println(fetch);

List<SalesInventoryEntity> all = salesInventoryRepository.findAll();
all.forEach(s -> System.out.println(s));
jOOQ's generated SQL is correct, but it finds no data when the test method runs under JPA's @Transactional; when I use the JPA repository to read instead, it finds the right data.
So my main question is: how can I make jOOQ see the right data inside a JPA @Transactional method?
Here is what I used to initialize the base data. It's a test method, and its class is annotated with @Transactional, so the method is also @Transactional, right?
public void initSalesInventories() {
    List<SalesInventoryEntity> salesInventories = Lists.newArrayList();
    ItemEntity itemEntity = itemRepository.findById(itemId1).get();
    int i = 0;
    for (StockLocationEntity stockLocationEntity : stockLocationEntities) {
        SalesInventoryEntity salesInventoryEntity = new SalesInventoryEntity();
        salesInventoryEntity.setStockLocation(stockLocationEntity);
        salesInventoryEntity.setSalesOrganization(usSalesOrgEntity);
        salesInventoryEntity.setItem(itemEntity);
        salesInventoryEntity.setItemClass(ItemClass.SALEABLE);
        salesInventoryEntity.setOnHandQty(100);
        salesInventoryEntity.setReservedQty(0);
        salesInventoryEntity.setAvailableQty(100);
        salesInventoryEntity.setLeadTime(5);
        DocumentType[] values = DocumentType.values();
        salesInventoryEntity.setDocType(values[i % 7]);
        String code = "TO-201906112010000" + i;
        i++;
        salesInventoryEntity.setDocCode(code);
        salesInventories.add(salesInventoryEntity);
    }
    salesInventoryRepository.saveAll(salesInventories);
}
After initializing the base data, I used jOOQ to read it back and found nothing. I don't know whether jOOQ can't read another transaction's uncommitted data, or whether jOOQ just reads the committed state of the database. If you know something about this, please give me some advice.
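For reference, a likely cause is that jOOQ runs its queries on a separate JDBC connection, outside the Spring-managed transaction in which JPA buffered the inserts. Below is a minimal sketch of the usual remedy, assuming Spring Boot with a single DataSource (the configuration class and the dialect are illustrative assumptions, not code from the question):

import javax.sql.DataSource;
import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy;

@Configuration
public class JooqConfig {

    // Wrap the DataSource so jOOQ's connection joins the surrounding
    // Spring-managed transaction instead of opening its own.
    @Bean
    public DSLContext dslContext(DataSource dataSource) {
        return DSL.using(new TransactionAwareDataSourceProxy(dataSource),
                SQLDialect.POSTGRES); // assumption: adjust the dialect to your database
    }
}

Even with a shared transaction, JPA may still be holding the inserts in its persistence context, so you would typically also call entityManager.flush() after saveAll(...) and before the jOOQ query, so the rows are actually written to the (still uncommitted) transaction.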
I have the below Stream being returned from the DB (Transaction is the entity class):
Stream<Transaction> transctions = transRepository.findByTransctionId();

public class Transaction {
    String transctionId;
    String accountId;
    String transName;
    String accountName;
}
Now my requirement is as below:
The Transaction entity has 4 fields, so JPA fetches all 4 fields from the DB. But the client who needs this data has sent a list of the column names he wants from the Transaction model:

List<String> columnNames = Arrays.asList("transctionId", "accountName");

I have to post this data to Kafka: take each Transaction from this stream and post it to Kafka. But the client wants only these 2 fields, "transctionId" and "accountName", to go as part of the Transaction to Kafka instead of all 4 fields.
The data should go to Kafka as JSON in the below format:
{
    "transctionId": "1234",
    "accountName": "test-account"
}
Basically, only the fields they asked for should go to Kafka, instead of converting the whole POJO to JSON and sending it.
Is there any way to achieve that?
If you need to invoke a method, but you only have its name, the only way I know is via reflection. I would do it like this:
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;
import org.apache.commons.lang3.StringUtils;

Stream<Transaction> transactions = transRepository.findByTransctionId();
List<Transaction> outTransactions = new ArrayList<>();
List<String> columnNames = new ArrayList<>(); // filled with the names sent by the client

transactions.forEach(tr -> {
    Transaction outTransaction = new Transaction();
    columnNames.forEach(col -> {
        try {
            // Look up the getter by name, e.g. "getTransctionId"
            var getMethod = tr.getClass().getMethod("get" + StringUtils.capitalize(col));
            Object value = getMethod.invoke(tr);
            String valueStr = value instanceof String ? value.toString() : "";
            // The setter lookup must include the parameter type
            var setMethod = outTransaction.getClass()
                    .getMethod("set" + StringUtils.capitalize(col), String.class);
            setMethod.invoke(outTransaction, valueStr);
        } catch (NoSuchMethodException | InvocationTargetException | IllegalAccessException e) {
            e.printStackTrace();
        }
    });
    outTransactions.add(outTransaction);
});
There is a lot of traversing, but with the requirements you have, this is the most generic solution I can come up with. Another shortcoming of this solution is the creation of new Transaction objects, which means that if there are many transactions, memory usage can grow. Maybe this solution can be optimised to take advantage of streaming the transactions from the DB.
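As a possible alternative, here is a sketch that serializes only the requested fields straight to JSON, assuming Jackson is on the classpath and Transaction can be annotated with @JsonFilter; the filter id "txFilter" and the Kafka producer call are illustrative assumptions:

import java.util.HashSet;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ser.FilterProvider;
import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
import com.fasterxml.jackson.databind.ser.impl.SimpleFilterProvider;

// Requires: @JsonFilter("txFilter") on the Transaction class
ObjectMapper mapper = new ObjectMapper();
FilterProvider filters = new SimpleFilterProvider()
        .addFilter("txFilter",
                SimpleBeanPropertyFilter.filterOutAllExcept(new HashSet<>(columnNames)));

transactions.forEach(tr -> {
    try {
        // Only the requested fields, e.g. "transctionId" and "accountName", end up in the JSON
        String json = mapper.writer(filters).writeValueAsString(tr);
        // kafkaTemplate.send("transactions-topic", json); // hypothetical Kafka producer
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
});

This avoids building intermediate Transaction copies, so memory usage stays flat even for large streams.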
Another way to do it is to have different endpoints for each known set of properties that the client sends you. For example:
#GetMapping("/transaction_id_and_name")
List<Transaction> getTransactionsIdAndName() {
... obtain Transactions, return a new list of Transactions, with transaction_id and name ... }
#GetMapping("/transaction_id_and_status")
List<Transaction> getTransactionsNameAndStatus() {...}
I am getting the proper values from the DB, but I get duplicate values in the list when adding the list objects to my class object, in Spring Boot.
Please suggest how to fix this.
Code that gets the data from the DB (Rooms is my DB entity class):
CriteriaBuilder roomsBuilder = roomSession.getCriteriaBuilder();
CriteriaQuery<Rooms> query = roomsBuilder.createQuery(Rooms.class);
Root<Rooms> root = query.from(Rooms.class);
Predicate userRestriction = roomsBuilder.or(
        roomsBuilder.notEqual(root.get(SmatrEntityParameters.IS_DELETED), "Y"),
        roomsBuilder.isNull(root.get(SmatrEntityParameters.IS_DELETED)));
Predicate userRestriction2 = roomsBuilder.and(
        roomsBuilder.equal(root.join("properties").get(SmatrEntityParameters.PROPERTY_ID), propertyId));
query.where(roomsBuilder.and(userRestriction, userRestriction2));
Query q = roomSession.createQuery(query);
List<Rooms> getroomslistobj = q.getResultList();
Code that iterates the list (getAllRoomsobj is the main response POJO):
List<GetAllRooms> getallroomslistobj = new ArrayList<GetAllRooms>();
for (int i = 0; i < getroomslistobj.size(); i++) {
    int dbroomId = getroomslistobj.get(i).getRoomId();
    String dbroomName = getroomslistobj.get(i).getRoomName();
    // Actual code start
    getAllRoomsobj.setRoomId(dbroomId);
    getAllRoomsobj.setRoomName(dbroomName);
    getallroomslistobj.add(getAllRoomsobj);
    // Actual code end
}
I tried the following in the middle of the actual code, but I did not want to create a new object for the response class:
GetAllRooms object = new GetAllRooms();
object.setRoomId(dbroomId);
object.setRoomName(dbroomName);
getallroomslistobj.add(object);
Please help me out, thanks in advance.
You can try it with Java 8's Stream.map():
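A minimal sketch of that suggestion (assuming the GetAllRooms setters from the question): mapping each entity to a freshly created response object is what removes the duplicates, because the original loop reuses one shared getAllRoomsobj instance for every row.

import java.util.List;
import java.util.stream.Collectors;

List<GetAllRooms> getallroomslistobj = getroomslistobj.stream()
        .map(room -> {
            GetAllRooms dto = new GetAllRooms(); // a new response object per row
            dto.setRoomId(room.getRoomId());
            dto.setRoomName(room.getRoomName());
            return dto;
        })
        .collect(Collectors.toList());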
I'm using JPA with a native query that returns about 13k records. I thought of using the Java 8 Stream API, but the result never comes back.
I can't paginate, because the result will populate a combo box.
My repository returns the Stream, and I added @Transactional(readOnly = true) to make it work:
#Query(value = "select * from mytable", native = true)
Stream<MyTable> getTableStream()
#Transactional(readOnly = true)
public Stream<MyTable> getTableStream() {
return repository.getTableStream()
}
#GetMapping(value = "/table", produces = MediaType.APPLICATION_STREAM_JSON_VALUE)
#Transactional(readOnly = true)
public ResponseEntity<Stream<MyTable>> getMailingClient() {
Stream<MyTable> body = service.getTableStream();
return ResponseEntity.ok(body);
}
None of the links and resources I found about streams show how to return the JSON from a Spring REST API.
My frontend is Angular 6, and the nearest I got was a custom object with no result.
Loading all 13k records into a combo box sounds like a very slow solution. I would recommend implementing a search based on a LIKE query, something like this:
#Query("SELECT * FROM mytable WHERE name like ':name%'")
Stream<MyTable> getTableStream(#Param("name") String name);
But if you really want to load all of the records, you can use a java.util.Collection or java.util.List instead of a stream.
Collection<MyTable> getTableStream();
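If you do keep the Stream-returning repository method, here is a sketch of one way to make it work (method and type names follow the question): collect the stream inside the transactional service method, because a Stream handed to the controller is only consumed during JSON serialization, after the transaction and its JDBC connection have already closed.

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import org.springframework.transaction.annotation.Transactional;

@Transactional(readOnly = true)
public List<MyTable> getTableList() {
    // Consume the stream while the transaction is still open
    try (Stream<MyTable> s = repository.getTableStream()) {
        return s.collect(Collectors.toList());
    }
}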
I am trying to pass in an ARRAY of BLOBs, but I am getting errors.
uploadFiles = new SimpleJdbcCall(dataSource).withCatalogName("FILES_PKG")
        .withFunctionName("insertFiles").withReturnValue()
        .declareParameters(new SqlParameter("p_userId", Types.NUMERIC),
                new SqlParameter("p_data", Types.ARRAY, "BLOB_ARRAY"),
                new SqlOutParameter("v_groupId", Types.NUMERIC));
uploadFiles.compile();

List<Blob> fileBlobs = new ArrayList<>();
for (int x = 0; x < byteFiles.size(); x++) {
    fileBlobs.add(new javax.sql.rowset.serial.SerialBlob(byteFiles.get(x)));
}
final Blob[] data = fileBlobs.toArray(new Blob[fileBlobs.size()]);

SqlParameterSource in = new MapSqlParameterSource()
        .addValue("p_files", new SqlArrayValue<Blob>(data, "BLOB_ARRAY"))
        .addValue("p_userId", userId);
Map<String, Object> results = uploadFiles.execute(in);
I created a SQL Type in the DB
create or replace TYPE BLOB_ARRAY is table of BLOB;
Function Spec
FUNCTION insertFiles(p_userId IN NUMBER,
p_files IN BLOB_ARRAY)
RETURN NUMBER;
Function Body
FUNCTION insertFiles (p_userId IN NUMBER,
                      p_files  IN BLOB_ARRAY)
RETURN NUMBER
AS
    v_groupId NUMBER := FILE_GROUP_ID_SEQ.NEXTVAL;
    v_fileId  NUMBER;
BEGIN
    FOR i IN 1..p_files.COUNT
    LOOP
        v_fileId := FILE_ID_SEQ.NEXTVAL;
        BEGIN
            INSERT INTO FILES
                   (FILE_ID,
                    FILE_GROUP_ID,
                    FILE_DATA,
                    UPDT_USER_ID)
            SELECT v_fileId,
                   v_groupId,
                   p_files(i),
                   USER_ID
              FROM USERS
             WHERE USER_ID = p_userId;
        EXCEPTION WHEN OTHERS THEN
            v_groupId := -1;
        END;
    END LOOP;
    RETURN v_groupId;
END insertFiles;
I am not sure how to correctly pass the array of Blobs to the SQL Function.
Error:

java.sql.SQLException: Fail to convert to internal representation: javax.sql.rowset.serial.SerialBlob@87829c90
    at oracle.jdbc.oracore.OracleTypeBLOB.toDatum(OracleTypeBLOB.java:69) ~[ojdbc7.jar:12.1.0.1.0]
    at oracle.jdbc.oracore.OracleType.toDatumArray(OracleType.java:176) ~[ojdbc7.jar:12.1.0.1.0]
    at oracle.sql.ArrayDescriptor.toOracleArray(ArrayDescriptor.java:1321) ~[ojdbc7.jar:12.1.0.1.0]
    at oracle.sql.ARRAY.<init>(ARRAY.java:140) ~[ojdbc7.jar:12.1.0.1.0]
    at ...
UPDATE
After trying Luke's suggestion, I am getting the following error:
uncategorized SQLException for SQL [{? = call FILES_PKG.INSERTFILES(?, ?)}];
SQL state [99999]; error code [22922]; ORA-22922: nonexistent LOB value;
nested exception is java.sql.SQLException: ORA-22922: nonexistent LOB value] with root cause

java.sql.SQLException: ORA-22922: nonexistent LOB value
The error message appears to indicate the Oracle JDBC driver doesn't know what to do with the javax.sql.rowset.serial.SerialBlob object you've passed to it.
Try creating the Blob objects using Connection.createBlob instead. In other words, try replacing the following loop
for (int x = 0; x < byteFiles.size(); x++) {
    fileBlobs.add(new javax.sql.rowset.serial.SerialBlob(byteFiles.get(x)));
}
with
Connection conn = dataSource.getConnection();
for (int x = 0; x < byteFiles.size(); x++) {
    Blob blob = conn.createBlob();
    blob.setBytes(1, byteFiles.get(x));
    fileBlobs.add(blob);
}
Also, make sure that your parameter names are consistent between your SimpleJdbcCall and your stored function. Your SimpleJdbcCall declares the BLOB array parameter with name p_data but your stored function declaration uses p_files. If the parameter names are not consistent you are likely to get an Invalid column type error.
However, had I run the above test with a stored function of my own that actually did something with the BLOB values passed in, instead of just hard-coding a return value, I might have found that this approach didn't work. I'm not sure why; I'd probably have to spend some time digging around in the guts of Spring to find out.
I tried replacing the Blob values with Spring SqlLobValues, but that didn't work either. I guess Spring's SqlArrayValue<T> type doesn't handle Spring wrapper objects for various JDBC types.
So I gave up on a Spring approach and went back to plain JDBC:
import java.sql.Array;
import java.sql.Blob;
import java.sql.CallableStatement;
import java.sql.Types;
import java.util.ArrayList;
import java.util.List;
import oracle.jdbc.OracleConnection;
// ...
OracleConnection conn = dataSource.getConnection().unwrap(OracleConnection.class);

// Create the Blobs on the connection that will execute the call
List<Blob> fileBlobs = new ArrayList<>();
for (int x = 0; x < byteFiles.size(); x++) {
    Blob blob = conn.createBlob();
    blob.setBytes(1, byteFiles.get(x));
    fileBlobs.add(blob);
}

Array array = conn.createOracleArray("BLOB_ARRAY",
        fileBlobs.toArray(new Blob[fileBlobs.size()]));

CallableStatement cstmt = conn.prepareCall("{? = call insertFiles(?, ?)}");
cstmt.registerOutParameter(1, Types.NUMERIC);
cstmt.setInt(2, userId);
cstmt.setArray(3, array);
cstmt.execute();
int result = cstmt.getInt(1);
I've tested this with the stored function you've now included in your question, and it is able to call this function and insert the BLOBs into the database.
I'll leave it up to you to do what you see fit with the variable result and to add any necessary cleanup or transaction control.
However, while this approach worked, it didn't feel right. It didn't fit the Spring way of doing things. It did at least prove that what you were asking for was possible, in that there wasn't some limitation in the JDBC driver that meant you couldn't use BLOB arrays. I felt that there ought to be some way to call your function using Spring JDBC.
I spent some time looking into the ORA-22922 error and concluded that the underlying problem was that the Blob objects were being created using a different Connection to what was being used to execute the statement. The question then becomes how to get hold of the Connection Spring uses.
After some further digging around in the source code to various Spring classes, I realised that a more Spring-like way of doing this is to replace the SqlArrayValue<T> class with a different one specialised for BLOB arrays. This is what I ended up with:
import java.sql.Array;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

import oracle.jdbc.OracleConnection;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.jdbc.core.support.AbstractSqlTypeValue;

public class SqlBlobArrayValue extends AbstractSqlTypeValue {

    private List<byte[]> values;
    private String defaultTypeName;

    public SqlBlobArrayValue(List<byte[]> values) {
        this.values = values;
    }

    public SqlBlobArrayValue(List<byte[]> values, String defaultTypeName) {
        this.values = values;
        this.defaultTypeName = defaultTypeName;
    }

    @Override
    protected Object createTypeValue(Connection conn, int sqlType, String typeName)
            throws SQLException {
        if (typeName == null && defaultTypeName == null) {
            throw new InvalidDataAccessApiUsageException(
                    "The typeName is null in this context. Consider setting the defaultTypeName.");
        }

        // Create the Blobs on the same Connection that will execute the call,
        // which avoids the ORA-22922 error seen with SerialBlob.
        Blob[] blobs = new Blob[values.size()];
        for (int i = 0; i < blobs.length; ++i) {
            Blob blob = conn.createBlob();
            blob.setBytes(1, values.get(i));
            blobs[i] = blob;
        }

        Array array = conn.unwrap(OracleConnection.class)
                .createOracleArray(typeName != null ? typeName : defaultTypeName, blobs);
        return array;
    }
}
This class is heavily based on SqlArrayValue<T>, which is licensed under Version 2 of the Apache License. For brevity, I've omitted comments and a package directive.
With the help of this class, it becomes a lot easier to call your function using Spring JDBC. In fact, you can replace everything after the call to uploadFiles.compile() with the following:
SqlParameterSource in = new MapSqlParameterSource()
        .addValue("p_files", new SqlBlobArrayValue(byteFiles, "BLOB_ARRAY"))
        .addValue("p_userId", userId);
Map<String, Object> results = uploadFiles.execute(in);
I have a Java project with a collection of unit tests that perform simple updates and deletes using JPA 2. The unit tests run without a problem, and I can verify the changes in the database: all good. When I copy/paste the same function into a handler (a SmartFox extension), I receive a rollback exception:

Column 'levelid' cannot be null.

I'm looking for suggestions as to why this might be. I can perform data reads from within this extension (GetModelHandler), but trying to set data does not work. It's completely baffling.
So in summary -
This works...
@Test
public void Save()
{
    LevelDAO dao = new LevelDAO();
    List levels = dao.findAll();
    int i = levels.size();
    Level l = new Level();
    l.setName("test");
    Layer y = new Layer();
    y.setLayername("layer2");
    EntityManagerHelper.beginTransaction();
    dao.save(l);
    EntityManagerHelper.commit();
}
This fails with a rollback exception:
public class SetModelHandler extends BaseClientRequestHandler
{
    @Override
    public void handleClientRequest(User sender, ISFSObject params)
    {
        LevelDAO dao = new LevelDAO();
        List levels = dao.findAll();
        int i = levels.size();
        Level l = new Level();
        l.setName("test");
        Layer y = new Layer();
        y.setLayername("layer2");
        EntityManagerHelper.beginTransaction();
        dao.save(l);
        EntityManagerHelper.commit();
    }
}
The Level and Layer classes have a @OneToMany and a @ManyToOne attribute respectively.
Any ideas appreciated.
Update
Here's the schema
Level
--------
levelid (int) PK
name (varchar)

Layer
--------
layerid (int) 11 PK
layername (varchar) 100
levelid (int)

Foreign Key Name: Level.levelid, On Delete: no action, On Update: no action
When I changed
EntityManagerHelper.beginTransaction();
dao.update(l);
EntityManagerHelper.commit();
to
EntityManagerFactory factory = Persistence.createEntityManagerFactory("bwmodel");
EntityManager entityManager = factory.createEntityManager();
entityManager.getTransaction().begin();
dao.update(l);
entityManager.persist(l);
entityManager.getTransaction().commit();
This performs a save but not an update? I'm missing something obvious here.
The most likely problem I can see would be different database definitions. Tests of EJBs often use an in-memory database that is generated on the fly, whereas in actual production you are using a real database which is probably enforcing constraints.
Try assigning the levelid a value, or change the database schema.
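A sketch of both options (the @GeneratedValue mapping and the nextLevelId() helper are assumptions for illustration, not code from the question):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

// Option 1: let the database generate the key. This requires levelid to be
// AUTO_INCREMENT / IDENTITY in the real schema, not just in the test database.
@Entity
public class Level {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private int levelid;
    private String name;
    // getters and setters ...
}

// Option 2: assign the id explicitly before saving
Level l = new Level();
l.setLevelid(nextLevelId()); // hypothetical id source, e.g. a sequence
l.setName("test");
dao.save(l);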