There is a regex pattern
@Pattern(regexp = "^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9]).{8,128}$")
which, according to its conditions, enforces:
at least 8 characters;
no more than 128 characters;
at least one uppercase and one lowercase letter;
only Latin letters;
at least one digit;
Arabic numerals only;
I need an expression that satisfies:
at least 8 characters;
no more than 128 characters;
at least one uppercase and one lowercase letter;
only Latin letters;
at least one digit;
Arabic numerals only;
without spaces.
How can I do it?
This is my Entity:
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class ApplicantDto implements Serializable {
    private Long id;
    @Pattern(regexp = "^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9])[A-Za-z0-9]{8,128}$", message = "")
    private String password;
}
This is my repository:
@Repository
public interface ApplicantRepository extends JpaRepository<Applicant, Long> {
    Optional<Applicant> findByEmail(final String login);
}
This is my controller:
@RestController
@CrossOrigin
@Slf4j
@RequestMapping("/api/v1/applicants")
@RequiredArgsConstructor
public class ApplicantController {
    private final ApplicantRepository applicantRepository;

    @PostMapping(consumes = "application/json", produces = "application/json")
    public ResponseEntity<?> registration(@Valid @RequestBody final ApplicantDto applicantDto) {
        log.info("Method registration() with user {} in class {} started",
                applicantDto, getClass().getName());
        // note: the DTO must be mapped to the Applicant entity before saving
        return new ResponseEntity<>(applicantRepository.save(applicantDto), HttpStatus.CREATED);
    }
}
I need it to return an error when I send a request like this:
{
    "password": "Eas yHe1p"
}
I tried the following, but without success:
@Pattern(regexp = "^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9])(.[^\\s]){8,128}$")
@Pattern(regexp = "^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9])[A-Za-z0-9]{8,128}$")
Sorry guys, after I rebooted my computer it really works:
@Pattern(regexp = "^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9])[A-Za-z0-9]{8,128}$")
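For reference, here is a minimal sketch (plain java.util.regex, outside of Bean Validation; the sample passwords are made up) showing why this works: the character class [A-Za-z0-9] admits only Latin letters and Arabic digits, so spaces are rejected, unlike the earlier .{8,128} variant:

import java.util.regex.Pattern;

public class PasswordPatternCheck {
    private static final Pattern PASSWORD =
            Pattern.compile("^(?=.*[a-z])(?=.*[A-Z])(?=.*[0-9])[A-Za-z0-9]{8,128}$");

    public static void main(String[] args) {
        System.out.println(PASSWORD.matcher("EasyHe1p").matches());  // true: all conditions met
        System.out.println(PASSWORD.matcher("Eas yHe1p").matches()); // false: contains a space
        System.out.println(PASSWORD.matcher("easyhe1p").matches());  // false: no uppercase letter
    }
}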
Related
I'm totally new to reactive programming and I have problems coding even such an elementary task. The following method of a RestController should:
Take as a parameter a DoiReservationRequest object that represents a reservation of a yet-to-be-published DOI number (https://www.doi.org/). This reservation is meaningful only within our internal systems. The parameter is passed in the body of the POST request. A DOI reservation request is a simple object:
public record DoiReservationRequest(String doi) {
}
Check that there is no previous reservation of the same number and that the DOI number has not actually already been submitted and published. For this purpose, try to find submissions with the same DOI in DoiSubmissionRepository, which is defined as:
@EnableMongoRepositories
@Repository
public interface DoiSubmissionRepository extends ReactiveMongoRepository<DoiSubmission, String> {
    Flux<DoiSubmission> findAllByDoi(Publisher<String> doi);
}
DoiSubmission is itself defined as:
@Getter
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@AllArgsConstructor
@ToString
@Document
public final class DoiSubmission {
    @Id
    private String id;
    @Indexed
    private String doi;
    private Integer version;
    private String xml;
    private Date timestamp;
}
If no submission exists, return HTTP 201 with a body that for now is empty, but before that save the reservation as a DOI submission with version 0 and empty xml content.
If submissions with the same doi exist (several different versions of the same DOI number with different xml data), return HTTP 409 with a body, yet to be determined, that describes the error.
The code hangs indefinitely when a POST request is made:
@PostMapping("/api/v1/reservation/")
public Mono<ResponseEntity<String>> create(@RequestBody Publisher<DoiReservationRequest> doi) {
    return doiSubmissionRepository
            .findAllByDoi(Mono.from(doi)
                    .map(DoiReservationRequest::doi))
            .hasElements()
            .flatMap(hasElements -> {
                if (hasElements) {
                    return Mono.just(ResponseEntity.status(HttpStatus.CONFLICT).body(""));
                } else {
                    return Mono.from(doi)
                            .map(doiReservationRequest -> new DoiSubmission(
                                    UUID.randomUUID().toString(),
                                    doiReservationRequest.doi(), 0, "", new Date()))
                            .flatMap(doiSubmissionRepository::save)
                            .then(Mono.just(ResponseEntity.status(HttpStatus.OK).body("")));
                }
            });
}
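A likely cause (an assumption, as the thread contains no confirmed answer): the request-body Publisher is subscribed twice, once inside findAllByDoi and again in the else branch, and a WebFlux request body cannot be re-read, so the second Mono.from(doi) never emits. A sketch that reads the body exactly once and reuses the decoded value:

@PostMapping("/api/v1/reservation/")
public Mono<ResponseEntity<String>> create(@RequestBody Mono<DoiReservationRequest> doi) {
    // decode the request body once, then branch on the lookup result
    return doi.flatMap(request ->
            doiSubmissionRepository.findAllByDoi(Mono.just(request.doi()))
                    .hasElements()
                    .flatMap(hasElements -> hasElements
                            ? Mono.just(ResponseEntity.status(HttpStatus.CONFLICT).body(""))
                            : doiSubmissionRepository
                                    .save(new DoiSubmission(UUID.randomUUID().toString(),
                                            request.doi(), 0, "", new Date()))
                                    .thenReturn(ResponseEntity.status(HttpStatus.CREATED).body(""))));
}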
This question already has an answer here:
Could not resolve pointer: /definitions/Error-ModelName
(1 answer)
Closed 1 year ago.
I have a Spring controller method as follows.
@PutMapping("/update")
public ResponseEntity<String> updateMethod(@RequestBody ListDto listDto) {
    ...
}
The input parameter should be an instance of the ListDto class. The ListDto class is as follows.
@Data
@ApiModel(description = "update list dto")
public class ListDto extends ArrayList<ObjectDto> {
}
The ObjectDto class is as follows.
@Data
@ApiModel(description = "update object dto")
public class ObjectDto {
    @ApiModelProperty(example = "1")
    private String id;
    @ApiModelProperty(example = "new message")
    private String message;
}
The issue is, when trying to use the method in Swagger, I get the below error; it seems the Swagger definition for the ObjectDto class does not get created at runtime.
Is there a way to force the definition to get created and make this error disappear?
Try reinstalling your schema. To do so, you can simply reinstall Swagger!
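If reinstalling does not help, another commonly suggested workaround (an assumption here, since the linked duplicate deals with the same missing-definition symptom) is to register the model explicitly on the springfox Docket so that a schema for ObjectDto is always generated:

@Configuration
@EnableSwagger2
public class SwaggerConfig {

    // TypeResolver comes from com.fasterxml.classmate and is provided as a bean by springfox
    @Bean
    public Docket api(TypeResolver typeResolver) {
        return new Docket(DocumentationType.SWAGGER_2)
                // force a definition for ObjectDto to be created at startup
                .additionalModels(typeResolver.resolve(ObjectDto.class))
                .select()
                .apis(RequestHandlerSelectors.any())
                .build();
    }
}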
I have a working setup for Spring Cloud Kafka Streams with functional programming style.
There are two use cases, which are configured via application.properties.
Both of them work individually, but as soon as I activate both at the same time, I get a serialization error for the output stream of the second use case:
Exception in thread "ActivitiesAppId-05296224-5ea1-412a-aee4-1165870b5c75-StreamThread-1" org.apache.kafka.streams.errors.StreamsException:
Error encountered sending record to topic outputActivities for task 0_0 due to:
...
Caused by: org.apache.kafka.common.errors.SerializationException:
Can't serialize data [com.example.connector.model.Activity@497b37ff] for topic [outputActivities]
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException:
Incompatible types: declared root type ([simple type, class com.example.connector.model.Material]) vs com.example.connector.model.Activity
The last line here is important: the "declared root type" comes from the Material class, not the Activity class, which is probably the source of the error.
Again, when I activate only the second use case before starting the application, everything works fine. So I assume the "Materials" processor somehow interferes with the "Activities" processor (or its serializer), but I don't know when or where.
Setup
1.) use case: "Materials"
one input stream -> transformation -> one output stream
@Bean
public Function<KStream<String, MaterialRaw>, KStream<String, Material>> processMaterials() {...}
application.properties
spring.cloud.stream.kafka.streams.binder.functions.processMaterials.applicationId=MaterialsAppId
spring.cloud.stream.bindings.processMaterials-in-0.destination=inputMaterialsRaw
spring.cloud.stream.bindings.processMaterials-out-0.destination=outputMaterials
2.) use case: "Activities"
two input streams -> joining -> one output stream
@Bean
public BiFunction<KStream<String, ActivityRaw>, KStream<String, Assignee>, KStream<String, Activity>> processActivities() {...}
application.properties
spring.cloud.stream.kafka.streams.binder.functions.processActivities.applicationId=ActivitiesAppId
spring.cloud.stream.bindings.processActivities-in-0.destination=inputActivitiesRaw
spring.cloud.stream.bindings.processActivities-in-1.destination=inputAssignees
spring.cloud.stream.bindings.processActivities-out-0.destination=outputActivities
The two processors are also defined as stream function in application.properties: spring.cloud.stream.function.definition=processActivities;processMaterials
Thanks!
Update - Here's how I use the processors in the code:
Implementation
// Material model
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class MaterialRaw {
    private String id;
    private String name;
}

@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class Material {
    private String id;
    private String name;
}

// Material processor
@Bean
public Function<KStream<String, MaterialRaw>, KStream<String, Material>> processMaterials() {
    return materialsRawStream -> materialsRawStream.map((recordKey, materialRaw) -> {
        // some transformation
        final var newId = materialRaw.getId() + "---foo";
        final var newName = materialRaw.getName() + "---bar";
        final var material = new Material(newId, newName);
        // output
        return new KeyValue<>(recordKey, material);
    });
}
// Activity model
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class ActivityRaw {
    private String id;
    private String name;
}

@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class Assignee {
    private String id;
    private String assignedAt;
}

/**
 * Combination of `ActivityRaw` and `Assignee`
 */
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class Activity {
    private String id;
    private Integer number;
    private String assignedAt;
}
// Activity processor
@Bean
public BiFunction<KStream<String, ActivityRaw>, KStream<String, Assignee>, KStream<String, Activity>> processActivities() {
    return (activitiesRawStream, assigneesStream) -> {
        final var joinWindow = JoinWindows.of(Duration.ofDays(30));
        final var streamJoined = StreamJoined.with(
                Serdes.String(),
                new JsonSerde<>(ActivityRaw.class),
                new JsonSerde<>(Assignee.class)
        );
        final var joinedStream = activitiesRawStream.leftJoin(
                assigneesStream,
                new ActivityJoiner(),
                joinWindow,
                streamJoined
        );
        final var mappedStream = joinedStream.map((recordKey, activity) -> {
            return new KeyValue<>(recordKey, activity);
        });
        return mappedStream;
    };
}
This turns out to be an issue with the way the binder infers Serde types when there are multiple functions with different outbound target types, one with Activity and another with Material in your case. We will have to address this in the binder. I created an issue here.
In the meantime, you can follow this workaround.
Create a custom Serde class as below.
public class ActivitySerde extends JsonSerde<Activity> {}
Then, explicitly use this Serde for the outbound of your processActivities function using configuration.
For e.g.,
spring.cloud.stream.kafka.streams.bindings.processActivities-out-0.producer.valueSerde=com.example.so65003575.ActivitySerde
Please change the package to the appropriate one if you are trying this workaround.
Here is another recommended approach. If you define a bean of type Serde with the target type, that takes precedence as the binder will do a match against the KStream type. Therefore, you can also do it without defining that extra class in the above workaround.
@Bean
public Serde<Activity> activitySerde() {
    return new JsonSerde<>(Activity.class);
}
Here are the docs that explain all these details.
You need to specify which binder to use for each function: s.c.s.bindings.xxx.binder=....
However, without that, I would have expected an error such as "multiple binders found but no default specified", which is what happens with message channel binders.
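For illustration, a sketch of what that per-binding configuration could look like in application.properties (the binder names kstream1 and kstream2 are hypothetical and must match binders declared elsewhere in your configuration):

spring.cloud.stream.bindings.processMaterials-in-0.binder=kstream1
spring.cloud.stream.bindings.processMaterials-out-0.binder=kstream1
spring.cloud.stream.bindings.processActivities-in-0.binder=kstream2
spring.cloud.stream.bindings.processActivities-in-1.binder=kstream2
spring.cloud.stream.bindings.processActivities-out-0.binder=kstream2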
I tried to get an entity via Data JPA & Data Rest without HATEOAS.
The condition is that I use the HATEOAS form, but sometimes I need a pure JSON response.
So I create the JSON by defining a controller path separate from the repository's endpoint and creating a separate DTO class.
This is my code:
@RepositoryRestController
public class MetricController {
    @Autowired
    private MetricRepository metricRepository;

    @RequestMapping(method = RequestMethod.GET, value = "/metrics/in/{id}")
    public @ResponseBody MetricDTO getMetric(@PathVariable Long id) {
        return MetricDTO.fromEntity(metricRepository.getOne(id));
    }
}
@RepositoryRestResource
public interface MetricRepository extends JpaRepository<Metric, Long> { }
@Setter
@Getter
@NoArgsConstructor
@AllArgsConstructor
public class MetricDTO {
    private SourceType sourceType;
    private String metricTypeField;
    private String metricType;
    private String instanceType;
    private String instanceTypeField;
    private List<String> metricIdFields;
    private List<String> valueFields;
    private Map<String, String> virtualFieldValueEx;

    public static MetricDTO fromEntity(Metric metric) {
        return new MetricDTO(
                metric.getSourceType(),
                metric.getMetricTypeField(),
                metric.getMetricType(),
                metric.getInstanceType(),
                metric.getInstanceTypeField(),
                metric.getMetricIdFields(),
                metric.getValueFields(),
                metric.getVirtualFieldValueEx()
        );
    }
}
This is how I do it now, but I expect there are better options and patterns.
The question is whether this is the best way.
HATEOAS (Hypermedia as the Engine of Application State) is a constraint of the REST application architecture.
It basically means that a consumer of your REST endpoints can navigate between them with the help of links.
Let's take your example:

| HTTP Method | Relation (rel) | Link |
| --- | --- | --- |
| GET | Up | /metrics/in |
| GET | Self | /metrics/in/{id} |
| GET | SourceType | /sourceType/{id} |
| GET | metricIdFields | /url for each in JSON array |
| DELETE | Delete | /employee/{employeeId} |
Use the org.springframework.hateoas.Links class to create such links in your DTOs.
In your DTO add:

public class MetricDTO {
    private Links links;
    // getters and setters
    // inside your setters, add Self, Get, Create, Delete links for the current resource
}
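For illustration, a minimal sketch of populating those links with Spring HATEOAS in the controller from the question (the hrefs are taken from the table above; building the links in the controller rather than in the setters is just one possible design):

MetricDTO dto = MetricDTO.fromEntity(metricRepository.getOne(id));
dto.setLinks(Links.of(
        Link.of("/metrics/in/" + id).withSelfRel(),
        Link.of("/metrics/in").withRel("up")));
return dto;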
https://www.baeldung.com/spring-hateoas-tutorial
This question already has answers here:
Random documents from MongoDB using spring-data
(2 answers)
Closed 5 years ago.
Let's assume the following structure:
A user class:
public class User {
    @Id
    String id;
    String name;
    //...
}
The users repository:
public interface UserRepository extends MongoRepository<User, String> {
List<User> findByRandom(); // this method signature does not exist but would do what I intend to do
}
A user controller:
@Component
public class UserController {
    private UserRepository users;

    @Autowired
    public UserController(UserRepository users) {
        this.users = users;
    }

    public List<User> getRandomUsers() {
        return users.findByRandom(); // limit is missing here
    }
}
How would one go about retrieving random documents from a structure like this?
Having a field with a random value on each document would not be a desired solution, since the results should always be random (e.g. if I hit the random int value 4 and read the x following items, those would always be the same). Querying x times is also not preferred, since that would put too heavy a load on the database.
Can anyone help?
Thanks in advance,
Codehai
Just use the $sample stage:
Via Spring-Data (from v2.0 onwards):
SampleOperation sampleStage = Aggregation.sample(5);
Aggregation aggregation = Aggregation.newAggregation(sampleStage);
AggregationResults<OutType> output = mongoTemplate.aggregate(aggregation, "collectionName", OutType.class);
Directly through the Java driver:
import static com.mongodb.client.model.Aggregates.*;
users.aggregate(Arrays.asList(sample(5)));
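To tie this back to the controller in the question, a minimal sketch using an injected MongoTemplate (the collection name "user" and the limit parameter are assumptions for illustration):

public List<User> getRandomUsers(int limit) {
    // $sample draws the requested number of random documents server-side
    Aggregation aggregation = Aggregation.newAggregation(Aggregation.sample(limit));
    return mongoTemplate.aggregate(aggregation, "user", User.class).getMappedResults();
}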