I'm trying to build a HAL-compliant REST API with Spring HATEOAS.
After some fiddling, I managed to get it to work mostly as expected.
The (sample) output looks like this right now:
{
"_links": {
"self": {
"href": "http://localhost:8080/sybil/configuration/bricks"
}
},
"_embedded": {
"brickDomainList": [
{
"hostname": "localhost",
"port": 4223,
"_links": {
"self": {
"href": "http://localhost:8080/sybil/configuration/bricks/localhost"
}
}
},
{
"hostname": "synerforge001",
"port": 4223,
"_links": {
"self": {
"href": "http://localhost:8080/sybil/configuration/bricks/synerforge001"
}
}
}
]
}
}
I don't like the "brickDomainList" array's name. It should say "bricks", ideally. How can I change it?
Here's the controller that produces the output:
@RestController
@RequestMapping("/configuration/bricks")
public class ConfigurationBricksController {
private BrickRepository brickRepository;
private GraphDatabaseService graphDatabaseService;
@Autowired
public ConfigurationBricksController(BrickRepository brickRepository, GraphDatabaseService graphDatabaseService) {
this.brickRepository = brickRepository;
this.graphDatabaseService = graphDatabaseService;
}
@ResponseBody
@RequestMapping(method = RequestMethod.GET, produces = "application/hal+json")
public Resources<BrickResource> bricks() {
List<BrickDomain> bricks;
List<BrickResource> resources = new ArrayList<>();
List<Link> links = new ArrayList<>();
Link self = linkTo(ConfigurationBricksController.class).withSelfRel();
links.add(self);
try(Transaction tx = graphDatabaseService.beginTx()) { // begin transaction
// get all Bricks from database and cast them into a list so that they're actually fetched
bricks = new ArrayList<>(IteratorUtil.asCollection(brickRepository.findAll()));
// end transaction
tx.success();
}
for (BrickDomain brick : bricks) {
self = linkTo(methodOn(ConfigurationBricksController.class).brick(brick.getHostname())).withSelfRel();
BrickResource resource = new BrickResource(brick, self);
resources.add(resource);
}
return new Resources<>(resources, links);
}
}
Is there some Annotation or something I can add to change the array's name?
If you want/need to look at the BrickResource class or the Repositories or something look here: https://github.com/ttheuer/sybil/tree/mvctest/src/main/java/org/synyx/sybil
The BrickResource is in api/resources/, the repository is in database/, and the BrickDomain in domain/.
Thanks!
Just use Evo Inflector. If you have a Maven project, add the dependency:
<dependency>
<groupId>org.atteo</groupId>
<artifactId>evo-inflector</artifactId>
<version>1.2</version>
</dependency>
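With evo-inflector on the classpath, Spring HATEOAS should pluralize the rel derived from the domain class name instead of appending "List", so the collection key for BrickDomain would come out as "brickDomains" rather than "brickDomainList". A quick sketch of the library call itself (my own illustration, not from the original answer):
import org.atteo.evo.inflector.English;

public class PluralDemo {
    public static void main(String[] args) {
        // Evo Inflector pluralizes English words algorithmically
        System.out.println(English.plural("brick"));       // prints "bricks"
        System.out.println(English.plural("brickDomain")); // prints "brickDomains"
    }
}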
Or you can add @Relation(collectionRelation = "bricks") to the BrickDomain class:
@Relation(collectionRelation = "bricks")
public class BrickDomain { … }
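For completeness, a minimal sketch of the annotated class; the single-item rel "brick" and the hostname/port fields are assumptions based on the JSON above, and in Spring HATEOAS 1.x the annotation lives in org.springframework.hateoas.server.core instead:
import org.springframework.hateoas.core.Relation;

// "brick" is the rel used when a single resource is embedded,
// "bricks" replaces the derived "brickDomainList" key for collections
@Relation(value = "brick", collectionRelation = "bricks")
public class BrickDomain {
    private String hostname;
    private int port;
    // getters, setters and the rest of the class unchanged
}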
I'm reading Spring in Action, 5th edition, and learning Spring Cloud, HATEOAS, and WebFlux. I tried to write a REST controller as follows:
import static org.springframework.hateoas.server.reactive.WebFluxLinkBuilder.*;
@RestController
@RequestMapping(path = "/")
public class ServiceController {
private IngredientServiceClient ingredientClient;
@Autowired
public ServiceController(IngredientServiceClient ingredientClient) {
this.ingredientClient = ingredientClient;
}
@GetMapping("/ingredients/{id}")
public Mono<EntityModel<Ingredient>> getIngredientById(@PathVariable("id") String id) {
return ingredientClient.getIngredientById(id)
.flatMap(ingredient -> {
EntityModel<Ingredient> model = EntityModel.of(ingredient);
Mono<Link> link = linkTo(methodOn(ServiceController.class).getIngredientById(id)).withSelfRel().toMono();
return link.map(lk -> model.add(lk));
});
}
}
IngredientServiceClient.getIngredientById
public Mono<Ingredient> getIngredientById(String id) {
return wcBuilder.build()
.get().uri("http://ingredient-api/ingredients/{id}", id)
.retrieve().bodyToMono(Ingredient.class);
}
When I access localhost:8082/ingredients/FLTO on one node of my webapp, it shows me only a relative path, like this:
{
"id": "FLTO",
"name": "Flour Tortilla",
"type": "WRAP",
"_links": {
"self": {
"href": "/ingredients/FLTO"
}
}
}
I've tried WebMvcLinkBuilder, but it still did not work correctly. I found some explanations of my problem, but I'm not sure whether the context/exchange was null (and why). Could you help me?
Try setting spring.main.web-application-type=reactive (e.g. in application.properties). Most likely Spring MVC is also on the classpath, so the application starts as a servlet app and WebFluxLinkBuilder has no ServerWebExchange from which to build an absolute base URL, which is why the links come out relative.
I need to do a "join" between two different tables; my query is this:
db.Summary.aggregate([
{
$lookup: {
from: "OriginFilter",
localField: "origin",
foreignField: "code",
as: "originObject"
}
},
{
$project: {
"hub": 1,
"moment": 1,
"origin": { $arrayElemAt: [ "$originObject.displayName", 0 ] },
"_id": 0
}
}
]);
It works perfectly, but I have no idea how to do the same in Java code with Spring.
I have the LookupAggregationOperation class:
public class LookupAggregationOperation implements AggregationOperation {
private final DBObject operation;
public LookupAggregationOperation(String from, String localField,
String foreignField, String as) {
this.operation = new BasicDBObject("$lookup",
new BasicDBObject("from", from)
.append("localField", localField)
.append("foreignField", foreignField)
.append("as", as));
}
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return context.getMappedObject(operation);
}
}
and I'm using it this way:
LookupAggregationOperation getOriginDisplayName = new LookupAggregationOperation("OriginFilter","origin","code","originObject");
then, I have a groupOperation:
GroupOperation groupByDay = group("moment.year", "moment.month", "moment.day", "moment.hour", "countType", "origin", "originObject")
.sum("$count").as("count");
and finally I have a ProjectionOperation:
ProjectionOperation renameFields = project("count", "countType")
.and("_id.originObject.displayName").as("originDisplayName").and("_id.year").as("moment.year")
.and("_id.month").as("moment.month")
.and("_id.day").as("moment.day")
.and("_id.hour").as("moment.hour");
but and("_id.originObject.displayName").as("originDisplayName") isn't working. How can I do the query above?
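For context, this is roughly how the individual operations above would be wired into a single pipeline with MongoTemplate. This is a sketch, not the original code: the wrapper class, the collection name "Summary" and the result type SummaryResult are assumptions for illustration.
import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.GroupOperation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

public class SummaryAggregationExample {

    // SummaryResult is a hypothetical class with fields matching the projected output
    public List<SummaryResult> aggregate(MongoTemplate mongoTemplate,
                                         AggregationOperation lookup,
                                         GroupOperation groupByDay,
                                         ProjectionOperation renameFields) {
        // Chain the custom $lookup, the group and the projection into one pipeline
        Aggregation aggregation = Aggregation.newAggregation(lookup, groupByDay, renameFields);
        // Execute against the "Summary" collection, mapping each document to SummaryResult
        AggregationResults<SummaryResult> results =
                mongoTemplate.aggregate(aggregation, "Summary", SummaryResult.class);
        return results.getMappedResults();
    }
}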
Is there any way to write a JSON array in the application.properties file (Spring Boot), so that I can read the whole JSON with minimal code?
Below is my JSON object:
{ "userdetail": [
{
"user": "user1",
"password": "password2",
"email": "email1"
},
{
"user": "user2",
"password": "password2",
"email": "email2"
}]}
Also, if I use Jasypt to encrypt the password, what would the code look like?
Refer to the Spring Boot docs.
You can use YAML:
my:
servers:
- dev.example.com
- another.example.com
or normal configuration property arrays:
my.servers[0]=dev.example.com
my.servers[1]=another.example.com
Then you can load these properties into the application this way:
@ConfigurationProperties(prefix="my")
public class Config {
private List<String> servers = new ArrayList<String>();
public List<String> getServers() {
return this.servers;
}
}
For your case I would try something like:
users.userdetail[0].user=..
users.userdetail[0].password=..
...
And read it via
@ConfigurationProperties(prefix="users")
public class Userdetail {
public static class User {
private String user;
private String password;
private String email;
// getters and setters are required so Spring Boot can bind the values
}
private List<User> userdetail = new ArrayList<>();
public List<User> getUserdetail() {
return this.userdetail;
}
}
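A minimal sketch of how the properties class above could be registered and consumed; the application class and the CommandLineRunner bean are assumptions for illustration, not part of the original answer:
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableConfigurationProperties(Userdetail.class) // registers the @ConfigurationProperties class
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    CommandLineRunner printUsers(Userdetail userdetail) {
        // Every users.userdetail[i].* entry from application.properties is bound into the list
        return args -> userdetail.getUserdetail().forEach(System.out::println);
    }
}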
I want to update my DynamoDB table through Java using the DynamoDBMapper library.
What I did is push messages (the updates I want executed) to an SQS queue, and let my Java code consume these messages and update my DynamoDB table.
I found that when I push more than 150 messages in a short time using a script, all the messages are consumed, but only part of the records in DynamoDB are updated.
The code that updates DynamoDB looks like this:
@Service
public class PersistenceMessageProcessingServiceImpl implements PersistenceMessageProcessingService {
@Override
public void process(TextMessage textMessage){
String eventData = textMessage.getText();
updateEventStatus(eventData);
}
/*
Each input is a case-detail message for the Event table:
get the data, parse it, and partially update the related records in DynamoDB.
Finally, check whether any cases are still open; if not, change the state of the event.
*/
private void updateEventStatus(String eventData) throws ParseException, IOException {
RetryUtils retryUtils = new RetryUtils(maxRetries, waitTimeInMilliSeconds, influxService);
SNowResponse serviceNowResponse = parseData(eventData);
EventCaseMap eventCaseMap = eventCaseMapRepository.findBySysId(sysId);
if (eventCaseMap != null) {
Event event = eventRepository.findByEventId(eventCaseMap.getSecurityEventManagerId());
CaseManagementDetails caseManagementDetails = event.getCaseManagementDetails();
Case caseDetails = getCaseByCaseSystemId(caseManagementDetails, sysId);
caseDetails.setCaseStatus("Resolved");
caseDetails.setResolution(serviceNowResponse.getCloseCode());
caseDetails.setResolvedBy("A");
caseDetails.setAssessment(serviceNowResponse.getAssessment());
caseDetails.setResolutionSource("SEM");
retryUtils.run(() -> {
return eventRepository.updateEvent(event); }, RETRY_MEASUREMENT);
boolean stillOpen = false;
for (Case existingCase : caseManagementDetails.getCases()) {
if (("OPEN").equals(existingCase.getCaseStatus().toString())) {
stillOpen = true;
break;
}
}
if (!stillOpen) {
event.setState("CLOSED");
}
}
}
private Case getCaseByCaseSystemId(CaseManagementDetails caseManagementDetails, String sysId) {
Case caseDetails = null;
if (caseManagementDetails != null) {
List<Case> caseList = caseManagementDetails.getCases();
for (Case c : caseList) {
if (c.getCaseSystemId() != null && c.getCaseSystemId().equalsIgnoreCase(sysId)) {
caseDetails = c;
break;
}
}
}
return caseDetails;
}
}
/* EventCaseMap Table in my DynamoDB
data model is like this for EventCaseMap Table:
{
"caseSystemId": "bb9cc488dbf67b40b3d57709af9619f8",
"securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
*/
@Repository
public class EventCaseMapRepositoryImpl implements EventCaseMapRepository {
@Autowired
DynamoDBMapper dynamoDBMapper;
@Override
public EventCaseMap findBySysId(String sysId) {
EventCaseMap eventCaseMap = new EventCaseMap();
eventCaseMap.setCaseSystemId(sysId);
return dynamoDBMapper.load(eventCaseMap, DynamoDBMapperConfig.ConsistentReads.CONSISTENT.config());
}
}
/*
data model is like this for Event Table:
{
"caseManagementDetails": {
"cases": [
{
"caseId": "SIR0123456",
"caseStatus": "OPEN",
},
{
"caseId": "SIR0654321",
"caseStatus": "OPEN",
},
{
many other cases (about two hundred) ...
}
]
},
"state": "OPEN",
"securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
*/
@Repository
public class EventRepositoryImpl implements EventRepository {
@Autowired
DynamoDBMapper dynamoDBMapper;
@Override
public Event findByEventId(String eventId) {
Event event = new Event();
event.setSecurityEventManagerId(eventId);
return dynamoDBMapper.load(event, DynamoDBMapperConfig.ConsistentReads.CONSISTENT.config());
}
@Override
public boolean updateEvent(Event event) {
dynamoDBMapper.save(event, DynamoDBMapperConfig.SaveBehavior.UPDATE_SKIP_NULL_ATTRIBUTES.config());
return false;
}
}
I already tried pushing and consuming the messages one by one, in both 'Run' and 'Debug' mode in IntelliJ; everything works fine and all the cases are updated.
So I was wondering whether there are consistency problems in DynamoDB, but I am already using strongly consistent reads in my code.
Does anybody know what is happening in my code?
Here are the input, the output, and the expected output:
input:
many JSON files like this:
{
"number": "SIR0123456",
"state": "Resolved",
"sys_id": "bb9cc488dbf67b40b3d57709af9619f8",
"MessageAttributes": {
"TransactionGuid": {
"Type": "String",
"Value": "093ddb36-626b-4ecc-8943-62e30ffa2e26"
}
}
}
{
"number": "SIR0654321",
"state": "Resolved",
"sys_id": "bb9cc488dbf67b40b3d57709af9619f7",
"MessageAttributes": {
"TransactionGuid": {
"Type": "String",
"Value": "093ddb36-626b-4ecc-8943-62e30ffa2e26"
}
}
}
output for Event Table:
{
"caseManagementDetails": {
"cases": [
{
"caseId": "SIR0123456",
"caseStatus": "RESOLVED",
},
{
"caseId": "SIR0654321",
"caseStatus": "OPEN"
},
{
many other cases (about two hundred) ...
}
]
},
"state": "OPEN",
"securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
Expected output for Event Table:
{
"caseManagementDetails": {
"cases": [
{
"caseId": "SIR0123456",
"caseStatus": "RESOLVED",
},
{
"caseId": "SIR0654321",
"caseStatus": "RESOLVED"
},
{
many other cases (about two hundred) ...
}
]
},
"state": "OPEN",
"securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
I think the problem lies in the fact that the data is consumed and persisted to DynamoDB by multiple threads: if we consume all of these messages in a short time, some threads may not have finished, so the result we see is just the result of the last thread, not of all threads.
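If the lost updates really do come from several consumer threads doing read-modify-write on the same Event item, one option worth sketching (not part of the original code; the table name and field names here are assumptions) is DynamoDBMapper's optimistic locking: with a @DynamoDBVersionAttribute on the model, a conflicting save() fails with ConditionalCheckFailedException, and the item can be re-read and the update retried.
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBVersionAttribute;

@DynamoDBTable(tableName = "Event") // table name assumed for this sketch
public class Event {

    private String securityEventManagerId;
    private Long version;

    @DynamoDBHashKey(attributeName = "securityEventManagerId")
    public String getSecurityEventManagerId() { return securityEventManagerId; }
    public void setSecurityEventManagerId(String id) { this.securityEventManagerId = id; }

    // With a version attribute, DynamoDBMapper.save() adds a conditional check:
    // if another thread saved the item in between, the save fails with
    // ConditionalCheckFailedException and the caller can re-read and retry.
    @DynamoDBVersionAttribute
    public Long getVersion() { return version; }
    public void setVersion(Long version) { this.version = version; }

    // caseManagementDetails, state, ... as in the original model
}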
With the following setup, Spring doesn't pick up a transformer declared in a JSON stub file under the mappings directory, but it does when I declare the transformer directly in test code.
@Configuration
public class WiremockConfiguration {
@Bean
WireMockConfigurationCustomizer optionsCustomizer() {
return new WireMockConfigurationCustomizer() {
@Override
public void customize(WireMockConfiguration options) {
options.extensions(BodyDefinitionTransformer.class);
}
};
}
}
This is the stub mapping JSON file under the mappings directory:
{
"request": {
"method": "POST",
"urlPattern": "/some/thing"
},
"response": {
"status": 200,
"bodyFileName": "my_payload.json",
"transformers": [
"body-transformer"
],
"headers": {
"Content-Type": "application/json"
}
}
}
public class BodyDefinitionTransformer extends ResponseDefinitionTransformer {
@Override
public ResponseDefinition transform(Request request, ResponseDefinition responseDefinition, FileSource files,
Parameters parameters) {
return responseDefinition; //checking if this work by putting breakpoint here
}
@Override
public boolean applyGlobally() {
return false;
}
@Override
public String getName() {
return "body-transformer";
}
}
@ContextConfiguration
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@TestPropertySource
@AutoConfigureWireMock(port = 9632)
class DummyTestClass extends Specification {
def "some dummy test" () {
when:
stubFor(post("/some/thing").willReturn(aResponse()
//.withTransformers("body-transformer") // the transformer works if I declare it this way
.withTransformerParameter("test", "test"
)))
// rest of my test where execute above request
}
}
The code works perfectly when I declare the transformer using .withTransformers("body-transformer"), but when I put the transformer name into the transformers array in the JSON file, it doesn't work. Do you have any idea why?