When I try to execute a query on a GSI on DynamoDB, the following error is reported:
java.lang.IllegalArgumentException: A query conditional requires a sort key to be present on the table or index being queried, yet none have been defined in the model
What should I do in order to define this required sort key?
Here are some pieces of my code:
package com.test;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.ToString;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.*;
import java.time.Instant;
@Data
@Builder
@DynamoDbBean
@NoArgsConstructor
@AllArgsConstructor
@ToString
public class Accumulator {
private Instant createdAt;
private String userId;
private String transactionIds;
private String paymentAt;
private String type;
private Instant startAt;
private Instant endAt;
private Integer ordersTotal;
private Integer retries;
private String sortKey;
@DynamoDbPartitionKey
public String getUserId() {
return userId;
}
@DynamoDbSecondaryPartitionKey(indexNames= {"idx_by_payment_at"})
@DynamoDbSortKey
public String getPaymentAt() {
return paymentAt.toString();
}
}
Creating my table at DynamoDBConfig:
public DynamoDbAsyncTable<Accumulator> tableLocal(final DynamoDbEnhancedAsyncClient dynamoDbEnhancedAsyncClient) {
var tableAccumulator = dynamoDbEnhancedAsyncClient.table(tablename, BeanTableSchema.create(Accumulator.class));
try {
tableAccumulator.createTable(CreateTableEnhancedRequest.builder()
.provisionedThroughput(
ProvisionedThroughput.builder()
.writeCapacityUnits(5L).readCapacityUnits(5L)
.build())
.globalSecondaryIndices(
EnhancedGlobalSecondaryIndex.builder()
.indexName("idx_by_payment_at")
.projection(p -> p.projectionType(ProjectionType.KEYS_ONLY))
.provisionedThroughput(ProvisionedThroughput.builder()
.writeCapacityUnits(5L).readCapacityUnits(5L)
.build())
.build()
)
.build()).get();
Thread.sleep(3000);
} catch (InterruptedException | ExecutionException e) {
log.info("Skipping");
}
return tableAccumulator;
}
And my query:
DynamoDbTable<Accumulator> table = dynamoDbEnhancedClient.table(tableName, TableSchema.fromBean(Accumulator.class));
DynamoDbIndex<Accumulator> index = table.index("idx_by_payment_at");
QueryConditional queryConditional = QueryConditional.sortBetween(
Key.builder()
.partitionValue("2022-01-23T06:10:12.948334Z")
.build(),
Key.builder()
.partitionValue("2022-01-23T06:10:22.515769Z")
.build());
SdkIterable<Page<Accumulator>> query = index.query(queryConditional);
List<Page<Accumulator>> pages = query.stream().toList();
pages.forEach(page -> {
List<Accumulator> accumulators = page.items();
accumulators.stream().forEach(accumulator -> {
System.out.println(accumulator);
});
});
Thanks for any help.
From OP comment:
My intent is to query between 2 payment_at values of one user and one type.
If that is the case, you'll need a GSI with
a partition (hash) key that is a combination of user and type;
and a range (sort) key that is the payment_at.
This will allow you to perform the access you require. The projection will depend on your use case.
As it stands, your primary table keys allow you to select a range of payment_at values for a given user, so it might suffice to query the table itself. Alternatively, you could use a sort key that is a compound of type and payment_at. The best choice really depends on your access patterns.
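For illustration, here is a rough sketch of how such a GSI could be modelled with the enhanced client's bean annotations. The gsiPk attribute, the index name idx_user_type_by_payment_at, and the "userId#type" format are only examples, not names taken from your code:
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSecondaryPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSecondarySortKey;

@DynamoDbBean
public class Accumulator {
    private String userId;
    private String type;
    private String paymentAt;
    // Hypothetical composite attribute, populated as userId + "#" + type before each save.
    private String gsiPk;

    @DynamoDbPartitionKey
    public String getUserId() { return userId; }
    public void setUserId(String userId) { this.userId = userId; }

    public String getType() { return type; }
    public void setType(String type) { this.type = type; }

    // GSI partition key: the user/type combination.
    @DynamoDbSecondaryPartitionKey(indexNames = {"idx_user_type_by_payment_at"})
    public String getGsiPk() { return gsiPk; }
    public void setGsiPk(String gsiPk) { this.gsiPk = gsiPk; }

    // GSI sort key: payment_at, which is what sortBetween() ranges over.
    @DynamoDbSecondarySortKey(indexNames = {"idx_user_type_by_payment_at"})
    public String getPaymentAt() { return paymentAt; }
    public void setPaymentAt(String paymentAt) { this.paymentAt = paymentAt; }

    // ... remaining attributes as in your class ...
}
The query would then put the composite value in partitionValue and the payment_at bounds in sortValue (your current query passes the dates as partitionValue, but for sortBetween the range belongs in sortValue):
DynamoDbIndex<Accumulator> index = table.index("idx_user_type_by_payment_at");
QueryConditional between = QueryConditional.sortBetween(
        Key.builder().partitionValue("some-user-id#some-type")
                .sortValue("2022-01-23T06:10:12.948334Z").build(),
        Key.builder().partitionValue("some-user-id#some-type")
                .sortValue("2022-01-23T06:10:22.515769Z").build());
index.query(between).forEach(page -> page.items().forEach(System.out::println));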
I'm quite new to the elasticsearch and spring-data combo.
Let me give you some background. I started from this entity
@Document(indexName = "my-entity")
public class MyEntity {
@Id
private String id;
private String someField;
}
A few of them got created and indexed. Later on, I added a new field to the entity. Now it looks like this:
@Document(indexName = "my-entity")
public class MyEntity {
@Id
private String id;
private String someField;
@Field(type = FieldType.Date, format = DateFormat.date_time)
private Date date;
}
What I'm trying to achieve is to return all entities sorted by my new field (which not all existing entities have).
Initially I adjusted my Repository and defined something similar to List<MyEntity> findAllByOrderByDate();, but when I try to call it, I get the following exception:
"No mapping found for [date] in order to sort on"
I am aware that a possible solution would be to make use of the ignore_unmapped option from elasticsearch, but my question is: how do I achieve this with spring-data-elasticsearch?
On the other hand, I'm not set on using the Repository approach - I'm open to other solutions.
Thank you!
When you add a new field to sort on, the mapping in Elasticsearch for the index needs to be updated, otherwise Elasticsearch does not know what type this field has. Spring Data Elasticsearch repositories do not update the mapping automatically.
The following is a way to check whether the mapping for a class (in this example named Foo) needs an update:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.document.Document;
import org.springframework.stereotype.Component;
/**
* @author P.J. Meisch (pj.meisch@sothawo.com)
*/
@Component
public class FooMappingValidator {
private static final Logger LOGGER = LoggerFactory.getLogger(FooMappingValidator.class);
private final ObjectMapper objectMapper = new ObjectMapper();
private final ElasticsearchOperations operations;
public FooMappingValidator(ElasticsearchOperations operations) {
this.operations = operations;
}
@Autowired
public void checkFooMapping() {
var indexOperations = operations.indexOps(Foo.class);
if (indexOperations.exists()) {
LOGGER.info("checking if mapping for Foo changed");
var mappingFromEntity = indexOperations.createMapping();
var mappingFromEntityNode = objectMapper.valueToTree(mappingFromEntity);
var mappingFromIndexNode = objectMapper.valueToTree(indexOperations.getMapping());
if (!mappingFromEntityNode.equals(mappingFromIndexNode)) {
LOGGER.info("mapping for class Foo changed!");
indexOperations.putMapping(mappingFromEntity);
}
}
}
}
I have a REST API, and when a client calls a POST request with a body, the backend, after deserialization, should distinguish null from the absence of a value.
If the value in the JSON is null, then the value in the DB should become null.
If the value is absent from the JSON, then the value in the DB should remain unchanged.
JSON:
{
"id" : 1,
"name" : "sample name",
"value" : null
}
OR
{
"id" : 1,
"name" : "sample name"
}
In Java, after deserialization, both cases look like: value = null;
Java:
@Entity
@Table(name = "sample")
public class Sample {
@Id
@Column
private Long id;
@Column
private String name;
@Column
private Integer value;
// getters / setters
}
Sample REST request:
@PutMapping
public ResponseEntity<SampleDto> updateSample(@RequestBody SampleDto dto) {
return ResponseEntity.ok(service.updateSample(dto));
}
Sample service impl:
public SampleDto updateSample(SampleDto dto) {
Sample sample = sampleRepository.findById(dto.getId()).orElseThrow();
sample.setName(dto.getName());
sample.setValue(dto.getValue());
//At this point the backend needs to understand whether value is null or absent
//Because if the value in the JSON is null, then the value in the DB should become null
//If the value is absent from the JSON, then the value in the DB should remain unchanged
Sample newSample = sampleRepository.save(sample);
return modelMapper.map(newSample, SampleDto.class);
}
The project uses Spring Data.
Maybe I should use the @JsonDeserialize annotation or some other Hibernate annotation?
I tried @JsonDeserialize, but it is not a solution.
A partial update is different from a full-resource update and we should implement it in a different way. Let's create two request POJO classes: one will be used to create and fully update resources, the second will be used to partially update a given resource. To emphasise the difference we will use different HTTP methods. To distinguish null from absence we can use the java.util.Optional class.
The SampleCompleteRequest class is used together with the POST (create) and PUT (update) methods.
The SamplePartialRequest class is used together with the PATCH (partially update) method.
To avoid boilerplate code in this example I'm using Lombok and MapStruct, but they are not required.
Model
import jakarta.validation.constraints.NotBlank;
import lombok.Data;
@Data
public class SampleCompleteRequest {
@NotBlank
private String name;
private String value;
}
import jakarta.validation.constraints.NotBlank;
import lombok.Data;
import java.util.Optional;
@Data
public class SamplePartialRequest {
private Optional<@NotBlank String> name;
private Optional<String> value;
}
import lombok.Data;
@Data
public class SampleResponse {
private Long id;
private String name;
private String value;
}
import lombok.Data;
@Data
public class Sample {
// @Id - Hibernate annotations are removed
private Long id;
private String name;
private String value;
}
MapStruct
In MapStruct we need to define an interface with all methods we need.
import com.example.demo.model.SampleCompleteRequest;
import com.example.demo.model.SamplePartialRequest;
import com.example.demo.model.SampleResponse;
import jakarta.annotation.Nullable;
import org.mapstruct.BeanMapping;
import org.mapstruct.Mapper;
import org.mapstruct.MappingTarget;
import org.mapstruct.ReportingPolicy;
import java.util.Optional;
import static org.mapstruct.MappingConstants.ComponentModel.SPRING;
import static org.mapstruct.NullValueCheckStrategy.ALWAYS;
import static org.mapstruct.NullValuePropertyMappingStrategy.IGNORE;
@Mapper(unmappedTargetPolicy = ReportingPolicy.IGNORE, componentModel = SPRING)
public interface SamplesMapper {
@BeanMapping(nullValueCheckStrategy = ALWAYS, nullValuePropertyMappingStrategy = IGNORE)
Sample patch(SamplePartialRequest input, @MappingTarget Sample target);
Sample update(SampleCompleteRequest input, @MappingTarget Sample target);
SampleResponse mapToResponse(Sample input);
default String optionalToString(@Nullable Optional<String> nullable) {
return nullable == null ? null : nullable.orElse(null);
}
}
The MapStruct plugin will generate the boilerplate code for us. The class below is autogenerated; we do not need to implement it manually.
@Component
public class SamplesMapperImpl implements SamplesMapper {
@Override
public Sample patch(SamplePartialRequest input, Sample target) {
if ( input == null ) {
return target;
}
if ( input.getName() != null ) {
target.setName( optionalToString( input.getName() ) );
}
if ( input.getValue() != null ) {
target.setValue( optionalToString( input.getValue() ) );
}
return target;
}
@Override
public Sample update(SampleCompleteRequest input, Sample target) {
if ( input == null ) {
return target;
}
target.setName( input.getName() );
target.setValue( input.getValue() );
return target;
}
@Override
public SampleResponse mapToResponse(Sample input) {
if ( input == null ) {
return null;
}
SampleResponse sampleResponse = new SampleResponse();
sampleResponse.setId( input.getId() );
sampleResponse.setName( input.getName() );
sampleResponse.setValue( input.getValue() );
return sampleResponse;
}
}
Resource
A controller class is easy to implement:
import com.example.demo.model.SampleCompleteRequest;
import com.example.demo.model.SamplePartialRequest;
import com.example.demo.model.SampleResponse;
import com.example.service.SamplesMapper;
import com.example.service.SamplesService;
import jakarta.validation.Valid;
import lombok.AllArgsConstructor;
import org.springframework.hateoas.CollectionModel;
import org.springframework.hateoas.EntityModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PatchMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
@AllArgsConstructor
@RestController
@RequestMapping(value = "/api/v1/samples")
public class SamplesResource {
private final SamplesMapper mapper;
private final SamplesService samplesService;
@GetMapping
public CollectionModel<SampleResponse> listAll() {
List<SampleResponse> entities = samplesService.list().stream().map(mapper::mapToResponse).toList();
return CollectionModel.of(entities);
}
@PostMapping
public EntityModel<SampleResponse> addSample(@Valid @RequestBody SampleCompleteRequest request) {
var entity = samplesService.create(request);
var response = mapper.mapToResponse(entity);
return EntityModel.of(response);
}
@PutMapping(path = "{id}")
public EntityModel<SampleResponse> updateSample(@PathVariable Long id, @Valid @RequestBody SampleCompleteRequest request) {
var entity = samplesService.update(id, request);
var response = mapper.mapToResponse(entity);
return EntityModel.of(response);
}
@PatchMapping(path = "{id}")
public EntityModel<SampleResponse> partiallyUpdateSample(@PathVariable Long id, @Valid @RequestBody SamplePartialRequest request) {
var entity = samplesService.patch(id, request);
var response = mapper.mapToResponse(entity);
return EntityModel.of(response);
}
}
A service class is also straightforward:
import com.example.demo.model.SampleCompleteRequest;
import com.example.demo.model.SamplePartialRequest;
import lombok.AllArgsConstructor;
import org.springframework.stereotype.Service;
import java.util.List;
@Service
@AllArgsConstructor
public class SamplesService {
private final SamplesMapper mapper;
private final SamplesRepository repository;
public List<Sample> list() {
return repository.listAll();
}
public Sample create(SampleCompleteRequest request) {
var sample = mapper.update(request, new Sample());
return repository.save(sample);
}
public Sample update(Long id, SampleCompleteRequest request) {
var sample = repository.find(id).orElseThrow();
mapper.update(request, sample);
return repository.save(sample);
}
public Sample patch(Long id, SamplePartialRequest request) {
var sample = repository.find(id).orElseThrow();
mapper.patch(request, sample);
return repository.save(sample);
}
}
See also:
HTTP PUT vs HTTP PATCH in a REST API
Difference between Jackson objectMapper to others
Spring MVC PATCH method: partial updates
I'm using Spring Boot to communicate with MongoDB. I've made 2 models; the first one is Events:
package backend.blabla.model;
import org.springframework.data.mongodb.core.mapping.Document;
import lombok.Data;
@Document
@Data
public class Events {
private String name;
private String ProjectId;
public Events(
String name,
String ProjectId) {
this.name= name;
this.ProjectId = ProjectId;
}
}
and the second one is EventList, which contains an ArrayList of Events:
package backend.blabla.model;
import java.util.ArrayList;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import lombok.Data;
@Document
@Data
public class EventList {
@Id
private String id;
private String name;
private String projectId ;
public ArrayList<Events> events;
public EventList(String name,String projectId,ArrayList<Events> events){
this.name= name;
this.projectId = projectId;
this.events= new ArrayList<Events>();
}
}
so I can obtain the following structure
[
{
"id"="62a10a455238454587911d9d",
"name":"xxxxx",
"projectId":"62998dadb89ca15c88d6880a",
"events":[
{
"name":"to do",
"ProjectId":"62998dadb89ca15c88d6880a"
},
{
"name":"done",
"ProjectId":"62998dadb89ca15c88d6880a"
}
]
}
]
And I have this method in the controller to add a new event to the events ArrayList of a certain EventList:
#PutMapping("/{id}")
EventList addEvent(#PathVariable String id,#RequestBody Events event){
EventList eventListFromDB = eventListRepo.findById(id).orElseThrow(RuntimeException::new);
Events newEvent = new Events(event.getName(), event.getProjectId());
ArrayList<Events> eventsFromDB = eventListFromDB.getEvents();
eventsFromDB.add(newEvent);
eventListFromDB.setEvents(eventsFromDB);
return eventListRepo.save(eventListFromDB);
}
The problem that I'm facing is that when I try to add a new event, the old one gets removed. When I put
System.out.println(eventListFromDB)
before
eventsFromDB.add(newEvent);
it gives me
EventList(id=62a10a455238454587911d9d, name=xxxxx, projectId=62998dadb89ca15c88d6880a, events=[])
and after it, it gives me
EventList(id=62a10a455238454587911d9d, name=xxxxx, projectId=62998dadb89ca15c88d6880a, events=[Events(name=to do,ProjectId=62998dadb89ca15c88d)])
The first question is how to extract the data from the ArrayList,
and the second is how I can add to the ArrayList without removing its existing elements,
though I believe that if the first question gets answered, the second automatically will be too.
Thanks in advance.
I created an Entity class:
package order;
import javax.persistence.*;
import java.time.LocalDate;
import java.util.UUID;
@Entity
@Table(name = "bo_order")
public class Order {
@Id
@GeneratedValue(strategy= GenerationType.IDENTITY)
private Long id;
private String login;
private Long internalNumber;
private String food;
@Column(name="data_ins")
private LocalDate dateOfOrder;
private String orderNumber;
public Order(){}
public Order(String login)
{
this.login = login;
}
public Order(String login, Long internalNumber, String food, LocalDate dateOfOrder) {
this.login = login;
this.internalNumber = internalNumber;
this.food = food;
this.dateOfOrder = dateOfOrder;
}
public void setOrderNumber(String orderNumber) {
this.orderNumber = orderNumber;
}
public String generateOrderNumber()
{
return UUID.randomUUID().toString();
}
@Override
public String toString() {
return "Order{" +
"id=" + id +
", login='" + login + '\'' +
", internalNumber=" + internalNumber +
", food='" + food + '\'' +
", dateOfOrder=" + dateOfOrder +
'}';
}
}
added a Repository for it:
package order;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.CrudRepository;
import java.util.List;
public interface OrderRepository extends CrudRepository<Order, Long>{
List<Order> findByLogin(String login);
@Query(value="SELECT id, login FROM bo_order", nativeQuery = true)
public List<Order> findAllPurchasers();
}
and tried to display the values from the @Query in a Grid:
package purchasers;
import com.vaadin.annotations.Title;
import com.vaadin.server.VaadinRequest;
import com.vaadin.spring.annotation.SpringUI;
import com.vaadin.ui.*;
import order.Order;
import order.OrderRepository;
import org.springframework.beans.factory.annotation.Autowired;
@SpringUI(path = "/allpurchasers")
@Title("All today's purchasers")
public class AllPurchasersGUI extends UI {
@Autowired
private final OrderRepository orderRepository;
final Grid<Order> grid;
public AllPurchasersGUI(OrderRepository pr) {
this.orderRepository = pr;
grid = new Grid<>(Order.class);
grid.setSizeFull();
}
@Override
protected void init(VaadinRequest vaadinRequest) {
setContent(grid);
listPurchasers();
}
private void listPurchasers()
{
grid.setItems(orderRepository.findAllPurchasers());
}
}
but I got an error org.springframework.dao.InvalidDataAccessResourceUsageException: could not execute query; SQL [SELECT id, login FROM bo_order]; nested exception is org.hibernate.exception.SQLGrammarException: could not execute query and then
Caused by: org.postgresql.util.PSQLException: The column name data_ins was not found in this ResultSet.
I know data_ins is not in the ResultSet because I don't want it there. I can display all the values from bo_order in the Grid with findAll, but I want just id and login. How can I achieve this? I also tried using List<Object> instead of List<Order>, but then I had problems displaying it.
TL;DR: Add grid.setColumns("id", "login") to the end of the listPurchasers() method, or include data_ins in the list of database columns in the @Query annotation.
When you create a Grid with the Grid(Class<T> beanType) constructor, one column will be added for every getter in the bean type. This will lead to an error unless data for all those getters are also loaded from the database.
In this case, the @Query annotation defines that data should only be fetched from the database for the columns id and login. The exception you get refers to the database column data_ins, which seems to be mapped to the dateOfOrder bean property, but I don't see any code that would remove the automatically added column for that property.
The easiest way of ensuring only desired columns are used by your grid is to use the setColumns method. That method works like setColumnOrder, except that it also removes any column that isn't included as a parameter.
Alternatively, you can change your @Query annotation to also include the data_ins column from the database if you actually want that data to be shown as a column in the grid.
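As a small sketch of the first option applied to your existing method (nothing else needs to change):
private void listPurchasers()
{
    grid.setItems(orderRepository.findAllPurchasers());
    // Keep only the columns the native query actually populates;
    // setColumns removes every other auto-generated column.
    grid.setColumns("id", "login");
}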
Grid takes its columns from the Order class, so if there are many getters they all get displayed as columns. I did:
private void listPurchasers()
{
List<Order> allOrders = (List<Order>) orderRepository.findAll();
grid.setItems(allOrders);
grid.setColumnOrder("id", "login", "dateOfOrder");
grid.removeColumn("food");
}
so it seems to be very easy to add/remove columns to display.
I have a Spring Batch job set up to read in a CSV.
In the reader it creates ProductCSV objects, one per row, using a FlatFileReader.
In the writer it then converts each row into an actual entity object, which is mapped into the database using Hibernate via an extended ItemWriter.
It works great; the only problem I have is with ENUM typed fields. The error I get is:
Field error in object 'target' on field 'category': rejected value [Some Category]; codes [typeMismatch.target.category,typeMismatch.category,typeMismatch.com.project.enums.ProductCategory,typeMismatch]; arguments [org.springframework.context.support.DefaultMessageSourceResolvable: codes [target.category,category]; arguments []; default message [category]]; default message [Failed to convert property value of type 'java.lang.String' to required type 'com.project.enums.ProductCategory' for property 'category'; nested exception is java.lang.IllegalStateException: Cannot convert value of type [java.lang.String] to required type [com.project.ProductCategory] for property 'category': no matching editors or conversion strategy found]
Here is what the ENUM looks like:
package com.project.enums;
public enum ProductCategory
{
SomeCategory( "Some Category" ),
AnotherCategory( "Another Category" );
final String display;
private ProductCategory( String display )
{
this.display = display;
}
@Override
public String toString()
{
return display;
}
}
Here is what the ProductCSV object looks like:
package com.project.LoadSavingInfo;
import com.project.enums.ProductCategory;
public class ProductCSV
{
private ProductCategory category;
public ProductCategory getCategory()
{
return this.category;
}
public void setCategory( ProductCategory category )
{
this.category = category;
}
}
Here is what the actual object looks like:
package com.project;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.EnumType;
import javax.persistence.Enumerated;
import javax.persistence.Table;
import com.project.enums.ProductCategory;
@Entity
@Table( name = "product" )
public class Product
{
@Column( nullable = false )
@Enumerated(EnumType.STRING)
private ProductCategory category;
public ProductCategory getCategory()
{
return category;
}
public void setCategory( ProductCategory category )
{
this.category = category;
}
}
So when it reads in something like "Some Category" from the CSV, how do I convert this into the ENUM type? Any help or advice is greatly appreciated and if you need any more info please just ask.
The problem is that the standard Spring text-to-enum conversion is done using the enum's name (SomeCategory, AnotherCategory) and not its display name.
My advice is to convert the enum's display name to a ProductCategory object in your ItemProcessor:
import org.springframework.batch.item.ItemProcessor;
class MyItemProcessor implements ItemProcessor<ProductCSV, Product> {
    @Override
    public Product process(ProductCSV item) {
        Product p = new Product();
        // fromDisplayName is a lookup helper on the enum (see the sketch further below)
        p.setCategory(ProductCategory.fromDisplayName(item.getCategory()));
        return p;
    }
}
As a side effect, you have to declare:
public class ProductCSV {
private String category;
public String getCategory() {
return this.category;
}
public void setCategory( String category ) {
this.category = category;
}
}
You have the full process in your hands (and this is my preferred way; it is cleaner).
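Note that fromDisplayName is not part of the ProductCategory enum shown in the question; the name is only illustrative. A minimal sketch of such a lookup method, added inside the enum, could be:
public static ProductCategory fromDisplayName( String display )
{
    for ( ProductCategory category : values() )
    {
        if ( category.display.equals( display ) )
        {
            return category;
        }
    }
    throw new IllegalArgumentException( "Unknown ProductCategory display name: " + display );
}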
Another solution is to use your current classes and write a custom enum property editor/converter as described in Spring custom converter for all Enums.
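For that second option, one possible shape is a plain java.beans property editor that reuses the same display-name lookup; registering it with your reader's field set mapper is covered by the linked answer, and the class name here is only illustrative:
import java.beans.PropertyEditorSupport;
import com.project.enums.ProductCategory;

public class ProductCategoryEditor extends PropertyEditorSupport
{
    @Override
    public void setAsText( String text )
    {
        // Convert the CSV display name (e.g. "Some Category") into the enum constant.
        setValue( ProductCategory.fromDisplayName( text ) );
    }
}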