I was trying to return the average and the count of a set of ratings in one query.
Doing it in two separate queries was fairly easy, following an example I found while browsing. For example:
@Query("SELECT AVG(rating) FROM UserVideoRating WHERE videoId=:videoId")
public double findAverageByVideoId(@Param("videoId") long videoId);
but as soon as I wanted the average and the count in the same query, the trouble started. After many hours of experimenting, I found the approach below, so I am sharing it here. I hope it helps.
1) I needed a new class for the results:
2) Then I had to reference that class in the query:
@Query("SELECT new org.magnum.mobilecloud.video.model.AggregateResults(AVG(rating) as rating, COUNT(rating) as totalRatings) FROM UserVideoRating WHERE videoId=:videoId")
public AggregateResults findAvgRatingByVideoId(@Param("videoId") long videoId);
One query now returns both the average rating and the count of ratings.
Solved myself.
Custom class to receive results:
public class AggregateResults {
    private final double rating;
    private final int totalRatings;

    public AggregateResults(double rating, long totalRatings) {
        this.rating = rating;
        this.totalRatings = (int) totalRatings;
    }

    public double getRating() {
        return rating;
    }

    public int getTotalRatings() {
        return totalRatings;
    }
}
and
@Query("SELECT new org.magnum.mobilecloud.video.model.AggregateResults("
        + "AVG(rating) as rating, "
        + "COUNT(rating) as totalRatings) "
        + "FROM UserVideoRating "
        + "WHERE videoId=:videoId")
public AggregateResults findAvgRatingByVideoId(@Param("videoId") long videoId);
Thanks.
You should prevent NPEs and Hibernate tuple-parsing errors as follows:
public class AggregateResults {
    private final double rating;
    private final int totalRatings;

    public AggregateResults(Double rating, Long totalRatings) {
        this.rating = rating == null ? 0 : rating;
        this.totalRatings = totalRatings == null ? 0 : totalRatings.intValue();
    }

    public double getRating() {
        return rating;
    }

    public int getTotalRatings() {
        return totalRatings;
    }
}
I send a GET request from Postman and receive status 200. The problem is that I don't get any data back, only [].
Probably my error is that I don't map the constructor for the DTO class I made.
As another option, I guess I need a projection or something similar. Can anyone explain how to map the constructor so I get the data I want? I'm stuck and have been dealing with this error for two hours with nothing to show for it.
Repository:
@Repository
public interface ManagementRepository extends JpaRepository<Management, Long>, ManagementRepositoryCustom {
    @Query(value = "SELECT wm.action_description " +
            "FROM testdb.warehouse_management wm " +
            "WHERE wm.action_description = :action_description", nativeQuery = true)
    List<StockRecoveryDTO> findByDog(@Param("action_description") String action_description);
}
DTO:
package com.example.dto;
public class StockRecoveryDTO {
private Long id_product;
private String date;
private int quantity_product;
private String action_description;
private Long id_action;
private String quantity;
public StockRecoveryDTO() {
}
public StockRecoveryDTO(Long id_product, String date, int quantity_product, String action_description, Long id_action, String quantity) {
this.id_product = id_product;
this.date = date;
this.quantity_product = quantity_product;
this.action_description = action_description;
this.id_action = id_action;
this.quantity = quantity;
}
// getters and setters omitted
}
Console:
Hibernate: SELECT wm.action_description FROM testdb.warehouse_management wm WHERE wm.action_description = ?
That native query returns only action_description, so change the return type to List<String>. (If you do want a DTO out of a native query, note that SELECT new ... constructor expressions only work in JPQL; for native queries you would need something like an interface-based projection or an @SqlResultSetMapping.)
I tried to write the MySQL code in @Query, but it failed to validate, so I want to reproduce this SQL exactly in Querydsl. Can someone help me express the SUM and the CASE in the DSL query? Is there any documentation on this, or is it not possible? Thank you for your time; your help will be invaluable. I'm really trying my best to deal with this problem.
Mysql:
SELECT id_product, date,
       SUM(CASE
             WHEN action_description = 'import'
             THEN quantity_product
             ELSE -quantity_product
           END) AS test
FROM testdb.warehouse_management
INNER JOIN product
        ON warehouse_management.product_id = product.id_product
WHERE id_product = 3 AND date <= 1200
GROUP BY id_product, date
Querydsl:
public class ManagementRepositoryImpl implements ManagementRepositoryCustom {
    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public List<StockRecoveryDTO> findTotal(Long id_product, String date) {
        JPAQuery<StockRecoveryDTO> stockRecoveryDTOJPAQuery = new JPAQuery<>(entityManager);
        QManagement management = QManagement.management;
        QProduct product = QProduct.product;
        return stockRecoveryDTOJPAQuery.select(Projections.bean(StockRecoveryDTO.class,
                        product.id_product,
                        management.date,
                        product.quantity_product,
                        management.action_description,
                        management.id_action,
                        management.quantity))
                .from(management)
                .innerJoin(product)
                .on(management.product_id.eq(product.id_product))
                .fetch();
    }
}
DTO:
package com.example.dto;
public class StockRecoveryDTO {
private Long id_product;
private String date;
private int quantity_product;
private String action_description;
private Long id_action;
private String quantity;
private int total;
public StockRecoveryDTO() {
}
public StockRecoveryDTO(Long id_product, String date, int quantity_product, String action_description, Long id_action, String quantity, int total) {
this.id_product = id_product;
this.date = date;
this.quantity_product = quantity_product;
this.action_description = action_description;
this.id_action = id_action;
this.quantity = quantity;
this.total = total;
}
// getters and setters omitted
}
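For the Querydsl side, the CASE expression is built with com.querydsl.core.types.dsl.CaseBuilder, roughly new CaseBuilder().when(management.action_description.eq("import")).then(product.quantity_product).otherwise(product.quantity_product.negate()).sum(), combined with groupBy(product.id_product, management.date). That is a hedged sketch against the question's Q-types, not verified against this schema. The arithmetic the expression implements is just a signed sum, shown here in plain runnable Java:

```java
import java.util.List;
import java.util.Map;

public class SignedSum {
    // Plain-Java equivalent of the SQL CASE: rows whose action is "import"
    // add their quantity to the total; every other action subtracts it.
    // Each entry is (action_description, quantity_product).
    public static int total(List<Map.Entry<String, Integer>> rows) {
        int total = 0;
        for (Map.Entry<String, Integer> row : rows) {
            int qty = row.getValue();
            total += "import".equals(row.getKey()) ? qty : -qty;
        }
        return total;
    }
}
```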
I am trying to read a CSV file into a JavaRDD. To do that, I wrote the code below:
SparkConf conf = new SparkConf().setAppName("NameOfApp").setMaster("spark://Ip here:7077");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<CurrencyPair> rdd_records = sc.textFile(System.getProperty("user.dir") + "/data/data.csv", 2).map(
new Function<String, CurrencyPair>() {
public CurrencyPair call(String line) throws Exception {
String[] fields = line.split(",");
CurrencyPair sd = new CurrencyPair(Integer.parseInt(fields[0].trim()), Double.parseDouble(fields[1].trim()),
Double.parseDouble(fields[2].trim()), Double.parseDouble(fields[3]), new Date(fields[4]));
return sd;
}
}
);
My data file looks like this:
1,0.034968,212285,7457.23,"2019-03-08 18:36:18"
Here, to check whether my data loaded correctly, I tried to print some of it:
System.out.println("Count: " + rdd_records.count());
List<CurrencyPair> list = rdd_records.top(5);
System.out.println(list.toString());
But I got the following error at both System.out lines. I also tried each of them on its own, rather than printing the count and the list at the same time.
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
My custom object looks like this:
public class CurrencyPair implements Serializable {
private int id;
private double value;
private double baseVolume;
private double quoteVolume;
private Date timeStamp;
public CurrencyPair(int id, double value, double baseVolume, double quoteVolume, Date timeStamp) {
this.id = id;
this.value = value;
this.baseVolume = baseVolume;
this.quoteVolume = quoteVolume;
this.timeStamp = timeStamp;
}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public double getValue() {
return value;
}
public void setValue(double value) {
this.value = value;
}
public double getBaseVolume() {
return baseVolume;
}
public void setBaseVolume(double baseVolume) {
this.baseVolume = baseVolume;
}
public double getQuoteVolume() {
return quoteVolume;
}
public void setQuoteVolume(double quoteVolume) {
this.quoteVolume = quoteVolume;
}
public Date getTimeStamp() {
return timeStamp;
}
public void setTimeStamp(Date timeStamp) {
this.timeStamp = timeStamp;
}
}
So I could not figure out what is wrong here. What am I doing wrong?
Edit: It works fine when I write local instead of my own Spark master IP, but I need to run this against my own IP. So what could be wrong with my master node?
The issue is probably the anonymous class definition new Function<String, CurrencyPair>() { ... }, which forces Spark to try to serialize the enclosing class as well. Try a lambda instead:
rdd_records.map(
(Function<String, CurrencyPair>) line -> {
...
Note: You could read the file as a CSV instead and use the dataset API with a bean encoder to skip the manual parsing completely.
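Separately from the serialization fix, the manual parsing itself is worth pulling into a small helper that can be tested outside Spark. A sketch under the assumption that every line looks like the sample row (the field names mirror CurrencyPair, the Row class below is a hypothetical stand-in, and the date pattern is guessed from the data):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class CsvLineParser {
    // Minimal stand-in for the CurrencyPair fields from the question.
    public static class Row {
        public final int id;
        public final double value;
        public final double baseVolume;
        public final double quoteVolume;
        public final Date timeStamp;

        Row(int id, double value, double baseVolume, double quoteVolume, Date timeStamp) {
            this.id = id;
            this.value = value;
            this.baseVolume = baseVolume;
            this.quoteVolume = quoteVolume;
            this.timeStamp = timeStamp;
        }
    }

    // Parses one line like: 1,0.034968,212285,7457.23,"2019-03-08 18:36:18"
    // new Date(String) is deprecated and does not handle this format,
    // so SimpleDateFormat is used instead.
    public static Row parse(String line) throws ParseException {
        String[] fields = line.split(",");
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        return new Row(
                Integer.parseInt(fields[0].trim()),
                Double.parseDouble(fields[1].trim()),
                Double.parseDouble(fields[2].trim()),
                Double.parseDouble(fields[3].trim()),
                fmt.parse(fields[4].trim().replace("\"", "")));
    }
}
```

Inside the Spark map, the lambda body would then shrink to a single call to CsvLineParser.parse(line).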
Here is the problem I am currently having: my DAO class returns the data as indexed rows, but I want to return the selected data to the client in JSON format. How can I do it?
The controller class that returns data in JSON format:
MyController.java
@RequestMapping(value = "/ohlc", method = RequestMethod.POST)
public @ResponseBody List<OhlcResponse> getOhlc(@RequestBody OhlcRequest ohlcRequest) {
    List<OhlcResponse> ohlc = ohlcService.getOhlc(ohlcRequest);
    return ohlc;
}
The DAO class returns four values (minPrice, maxPrice, closingPrice and previousClosingPrice) by executing a stored procedure.
OhlcDaoImpl.java
public List<OhlcResponse> getOhlc(OhlcRequest ohlcRequest) {
    session = sessionFactory.openSession();
    SQLQuery q = session.createSQLQuery("EXEC uspGetOhlc :StockCode, :fromDate, :toDate");
    q.setString("StockCode", ohlcRequest.getStockSymbol());
    q.setDate("fromDate", ohlcRequest.getFromDate());
    q.setDate("toDate", ohlcRequest.getToDate());
    List<OhlcResponse> l = q.list();
    return l;
}
My stored procedure
USE WealthFeedSrv
GO
CREATE PROCEDURE uspGetOhlc
#StockCode varchar(50),
#fromDate date,
#toDate date
AS
BEGIN
SELECT spd.ClosingPrice, spd.PreviousClosingPrice, spd.MinPrice, spd.MaxPrice
FROM StockPriceDetl spd
inner join Stock stk on stk.Id = spd.StockId
inner join StockPriceMast spm on spm.Id = spd.MastId
WHERE stk.StockSymbol= #StockCode AND spm.TranDate Between #fromDate and #toDate
END
GO
The POJO class I want the returned data to bind to:
OhlcResponse.java
public class OhlcResponse {
private BigDecimal MaxPrice;
private BigDecimal MinPrice;
private BigDecimal PreviousClosingPrice;
private BigDecimal ClosingPrice;
public BigDecimal getMaxPrice() {
return MaxPrice;
}
public void setMaxPrice(BigDecimal maxPrice) {
MaxPrice = maxPrice;
}
public BigDecimal getMinPrice() {
return MinPrice;
}
public void setMinPrice(BigDecimal minPrice) {
MinPrice = minPrice;
}
public BigDecimal getPreviousClosingPrice() {
return PreviousClosingPrice;
}
public void setPreviousClosingPrice(BigDecimal previousClosingPrice) {
PreviousClosingPrice = previousClosingPrice;
}
public BigDecimal getClosingPrice() {
return ClosingPrice;
}
public void setClosingPrice(BigDecimal closingPrice) {
ClosingPrice = closingPrice;
}
public OhlcResponse(){
}
}
The result I want:
{
  "maxPrice": "200",
  "minPrice": "300",
  "closingPrice": "400",
  "previousClosingPrice": "500"
}
The error says it cannot cast to OhlcResponse. How can I bind the rows returned by OhlcDao to the OhlcResponse class and return them in JSON format?
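A likely cause (a sketch, not a confirmed diagnosis): createSQLQuery(...).list() without any mapping returns each row as Object[], not OhlcResponse, hence the cast failure. Hibernate's org.hibernate.transform.Transformers.aliasToBean(OhlcResponse.class), set via q.setResultTransformer(...), can populate the bean from the column aliases. The manual equivalent of that mapping looks like this, using a trimmed stand-in for the POJO and assuming the column order of the procedure's SELECT:

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

public class OhlcMapper {
    // Each row from q.list() is Object[] {ClosingPrice, PreviousClosingPrice,
    // MinPrice, MaxPrice}, matching the SELECT order in uspGetOhlc.
    public static List<OhlcResponse> map(List<Object[]> rows) {
        List<OhlcResponse> out = new ArrayList<>();
        for (Object[] row : rows) {
            OhlcResponse r = new OhlcResponse();
            r.setClosingPrice((BigDecimal) row[0]);
            r.setPreviousClosingPrice((BigDecimal) row[1]);
            r.setMinPrice((BigDecimal) row[2]);
            r.setMaxPrice((BigDecimal) row[3]);
            out.add(r);
        }
        return out;
    }

    // Trimmed stand-in for the question's POJO.
    public static class OhlcResponse {
        private BigDecimal closingPrice, previousClosingPrice, minPrice, maxPrice;
        public void setClosingPrice(BigDecimal v) { closingPrice = v; }
        public BigDecimal getClosingPrice() { return closingPrice; }
        public void setPreviousClosingPrice(BigDecimal v) { previousClosingPrice = v; }
        public BigDecimal getPreviousClosingPrice() { return previousClosingPrice; }
        public void setMinPrice(BigDecimal v) { minPrice = v; }
        public BigDecimal getMinPrice() { return minPrice; }
        public void setMaxPrice(BigDecimal v) { maxPrice = v; }
        public BigDecimal getMaxPrice() { return maxPrice; }
    }
}
```

Once the list holds real OhlcResponse objects, the @ResponseBody controller will serialize them to JSON as usual.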
Good evening! I am trying to set values from my query into the wrapper class TestWrapper.
TestWrapper class:
package com.bionic.wrappers;
public class TestWrapper {
private String name;
private int duration;
public TestWrapper(){
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getDuration() {
return duration;
}
public void setDuration(int duration) {
this.duration = duration;
}
}
Here is my query:
@NamedQuery(name = "getAvailableTestsNames",
        query = "SELECT test.testName, test.duration FROM Result result JOIN result.test test JOIN result.user user WHERE user.id = :userId")
and DAO class:
public List<TestWrapper> getAvailableTestsNames(long id) {
    Query query = em.createNamedQuery("getAvailableTestsNames");
    query.setParameter("userId", id);
    return (List<TestWrapper>) query.getResultList();
}
I get an exception, and I can see the values are not being set properly here:
public static Set<TestDTO> convertAvailableTestsToDTO(List<TestWrapper> tests){
Set<TestDTO> testDTOs = new HashSet<>();
for (TestWrapper test : tests){
TestDTO testDTO = new TestDTO(test.getName(), test.getDuration());
testDTOs.add(testDTO);
}
return testDTOs;
}
I get an exception:
java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to com.bionic.wrappers.TestWrapper
Thank you!
I don't have enough context, but in the getAvailableTestsNames method it looks like you are running a query that returns scalar results ("test.testName, test.duration"), so each row comes back as Object[] rather than a TestWrapper. If you just want a List of entities, the query can simply be "FROM XXX"; you can omit the SELECT field1, field2 ... and Hibernate does that for you.
See section 11.4.1.3 "Scalar results" of https://docs.jboss.org/hibernate/orm/4.3/manual/en-US/html/ch11.html#objectstate-querying vs. 11.4.1 "Executing queries".
Hope this helps
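If you do keep the scalar SELECT, the Object[] rows can also be mapped by hand. A minimal plain-Java sketch of that mapping (the nested TestWrapper is a stand-in for the question's class, and the row shape {testName, duration} follows the SELECT order):

```java
import java.util.ArrayList;
import java.util.List;

public class RowMapper {
    // For "SELECT test.testName, test.duration ...", Hibernate returns each
    // row as Object[] {testName, duration}; casting the list straight to
    // List<TestWrapper> is what triggers the ClassCastException.
    public static List<TestWrapper> toWrappers(List<Object[]> rows) {
        List<TestWrapper> result = new ArrayList<>();
        for (Object[] row : rows) {
            TestWrapper w = new TestWrapper();
            w.setName((String) row[0]);
            w.setDuration(((Number) row[1]).intValue()); // duration may arrive as Integer or Long
            result.add(w);
        }
        return result;
    }

    // Minimal stand-in for the question's wrapper class.
    public static class TestWrapper {
        private String name;
        private int duration;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public int getDuration() { return duration; }
        public void setDuration(int duration) { this.duration = duration; }
    }
}
```

Another option is a JPQL constructor expression (SELECT new com.bionic.wrappers.TestWrapper(...)), which would require adding a (String, int) constructor to TestWrapper.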