I was trying to create a Vaadin Hilla application to save a person's details. I created the endpoint and the person object, but the generated files show the error below.
[TypeScript] Property 'call' does not exist on type 'PersonDetails'.
The application runs and the pages come up, but with this error.
@Endpoint
@AnonymousAllowed
public class PersonEndpoint {
private final InterPersonService personService;
@Autowired
public PersonEndpoint(InterPersonService personService) {
this.personService = personService;
}
@Nonnull
public String savePerson(PersonDetails person) {
return this.personService.savePerson(person);
}
}
It is a Vaadin Hilla application to save a person's details. I expected the value from the text box to be set on the generated object, with the data then available in the endpoint's save method.
In a Spring Boot Web MVC REST service, I want to use the operation ID and path values from the SpringDoc-generated OpenAPI from within the service where it is generated. How can I get the OpenAPI JSON document without going through the web endpoint?
If I understand you correctly, you want to get the OpenAPI documentation in JSON format inside your Spring application code.
I do it this way:
1.) Create a component that extends the OpenApiResource class, and add a getOpenApiJson method that calls getOpenApi() (which creates or retrieves the OpenAPI model) and writeJsonValue() (which serializes it to JSON).
@Component
public class CustomOpenApiResource extends OpenApiResource {
public CustomOpenApiResource(ObjectFactory<OpenAPIService> openAPIBuilderObjectFactory,
AbstractRequestService requestBuilder,
GenericResponseService responseBuilder,
OperationService operationParser,
Optional<List<OperationCustomizer>> operationCustomizers,
Optional<List<OpenApiCustomiser>> openApiCustomisers,
Optional<List<OpenApiMethodFilter>> methodFilters,
SpringDocConfigProperties springDocConfigProperties,
SpringDocProviders springDocProviders) {
super(openAPIBuilderObjectFactory,
requestBuilder,
responseBuilder,
operationParser,
operationCustomizers,
openApiCustomisers,
methodFilters,
springDocConfigProperties,
springDocProviders);
}
@Override
protected String getServerUrl(HttpServletRequest request, String apiDocsUrl) {
// See OpenApiWebMvcResource for an example of how to implement this method
return "";
}
public String getOpenApiJson() throws JsonProcessingException {
return writeJsonValue(getOpenApi(Locale.getDefault()));
}
}
2.) Inject the CustomOpenApiResource component:
@Autowired
private CustomOpenApiResource resource;
And use getOpenApiJson()
String openApiJson = resource.getOpenApiJson();
What I want: I want to use Spring's @Autowired annotation in the file conventionally named "TypeRegistryConfiguration". It works perfectly well for steps files, but for some reason the dependency injection does not work in this file (there is no error/warning message, even at debug level). Spring scans "com.funky.steps", which contains the steps, the context and the type registry configuration file; see the example below.
Context:
package com.funky.steps.context;
@Component
public class CommonContext {
...
Type registry configuration:
package com.funky.steps.typeregistry;
public class TypeRegistryConfiguration implements TypeRegistryConfigurer {
@Autowired
private CommonContext context; // NOT INJECTED!
@Override
public Locale locale() {
return Locale.ENGLISH;
}
@Override
public void configureTypeRegistry(TypeRegistry typeRegistry) {
registerStuff(typeRegistry);
}
...
Steps:
package com.funky.steps;
public class WebServiceSteps {
@Autowired
private CommonContext context; // Correctly injected
...
Why I want it: I have steps that save variables in the context for later use. When I build an object using the type registry, I want to be able to access these variables. Example:
Given I call the web service 1
And the response field "id" will be used as "$id" # id is saved in the context
When I call the web service 2: # calls the type registry configuration to build the request using $id (which I cannot access, because it is in the context and @Autowired is not working)
| id | $id |
Then ...
This is not possible in Cucumber 4.x, but you can register parameter and data table types as part of the glue in Cucumber 5.0.0-RC1.
Instead of registering a parameter type with registry.registerParameterType, you use @ParameterType. The same works for data table types.
private final Catalogs catalogs; // lookup service assumed by the example
private final Basket basket;
private Catalog catalog;
@ParameterType("[a-z ]+")
public Catalog catalog(String name) {
return catalogs.findCatalogByName(name);
}
@ParameterType("[a-z ]+")
public Product product(String name) {
return catalog.findProductByName(name);
}
@Given("the {catalog} catalog")
public void the_catalog(Catalog catalog) {
this.catalog = catalog;
}
@When("a user places the {product} in his basket")
public void a_user_place_the_product_in_his_basket(Product product) {
basket.add(product);
}
Note: the method name is used as the parameter name. A parameter name can also be provided via the name property of @ParameterType.
See: https://cucumber.io/blog/announcing-cucumber-jvm-v5-0-0-rc1/
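Under the hood, Cucumber splices the @ParameterType regex into the step expression as a capture group and passes the captured text to the annotated method. Below is a minimal, framework-free sketch of that mechanism using plain JDK regex; the step text and class name are illustrative, not Cucumber's actual internals:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: what "the {catalog} catalog" becomes once the @ParameterType
// regex "[a-z ]+" is inlined as a capture group. The captured group is
// what Cucumber would hand to the catalog(String name) method.
public class ParameterTypeSketch {
    static final Pattern STEP = Pattern.compile("the ([a-z ]+) catalog");

    static String extractCatalogName(String stepText) {
        Matcher m = STEP.matcher(stepText);
        return m.matches() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(extractCatalogName("the summer sale catalog")); // prints "summer sale"
    }
}
```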
Whatever you are looking for can be achieved with Cucumber 5 using qaf-cucumber. Below is an example:
Given I call the web service 1
And the response field "id" will be used as "id"
When I call the web service 2:
| id | ${id} |
Then ...
The step implementations may look like this:
@And("the response field {string} will be used as {string}")
public void theResponseFieldWillBeUsedAs(String field, String key) {
// read the field from the response and store it for later steps
String value = read(field);
getBundle().setProperty(key, value);
}
@QAFTestStep("I call the web service 2:")
public void iCallWS2(Map<String, Object> map) {
// the map should contain key "id" with the resolved value
}
Furthermore, you can benefit from the web-service support library as well. It offers a request-call repository concept with parameters, which makes web service testing easy and maintainable.
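The ${id} resolution shown in the data table above can be sketched without QAF: a property bundle stores the value under "id" in one step, and ${key} placeholders are substituted when a later step reads its table. This is a hedged, plain-JDK sketch of the idea, not QAF's actual implementation:

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of ${key} placeholder substitution against a property bundle.
// Unknown keys are left as-is so the caller can spot unresolved values.
public class PlaceholderSketch {
    static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    static String resolve(String template, Properties bundle) {
        Matcher m = PLACEHOLDER.matcher(template);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // fall back to the raw placeholder when the key is unknown
            String value = bundle.getProperty(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Properties bundle = new Properties();
        bundle.setProperty("id", "42"); // stored by the earlier step
        System.out.println(resolve("${id}", bundle)); // prints "42"
    }
}
```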
What I want is impossible for now. See:
https://github.com/cucumber/cucumber-jvm/issues/1516
I am using AWS ECS to host my application and DynamoDB for all database operations, so I'll have the same database with different table names for different environments, such as "dev_users" (for the dev env), "test_users" (for the test env), etc. (This is how our company uses the same DynamoDB account for different environments.)
So I would like to change the "tableName" of the model class using an environment variable passed through the "AWS ECS task definition" environment parameters.
For example, my model class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use variables outside a class. Is there any way I can use the Docker env variable to select my table prefix?
My intent is to use it like this:
I know this is not possible as such, but is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before describing the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class package name as the "basePackage" path. (Please check the image.)
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed to this method, but it is null. So I'm checking all the possible ways to pass the value here.
Check the image for the error:
I haven't changed anything in my user model class, as the beans should replace the name of the DynamoDBTable when they are executed. But the table name overriding is not happening: data is pulled from the table name given at the model class level only.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add the bean as follows; here the prefix can be the environment name.
@Bean
public TableNameOverride tableNameOverrider() {
String prefix = ... // use @Value to inject the value via Spring, or any logic that defines the table prefix
return TableNameOverride.withTableNamePrefix(prefix);
}
For more details check out the complete details here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
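For clarity, the transformation TableNameOverride.withTableNamePrefix applies is simply concatenating the prefix onto each raw table name. A plain-Java sketch of that idea (no AWS SDK involved; DOCKER_ENV and "users" are reused from the question for illustration):

```java
import java.util.Optional;

// Sketch: the mapper resolves prefix + rawTableName for every entity,
// so passing an environment-derived prefix yields dev_users, test_users, etc.
public class TablePrefixSketch {
    static String withPrefix(String prefix, String rawName) {
        return prefix + rawName;
    }

    public static void main(String[] args) {
        // prefix derived from the environment, defaulting to "dev_" locally
        String prefix = Optional.ofNullable(System.getenv("DOCKER_ENV")).orElse("dev") + "_";
        System.out.println(withPrefix(prefix, "users")); // "dev_users" unless DOCKER_ENV is set
    }
}
```

Note that withTableNamePrefix prepends the prefix verbatim, so any separator such as the underscore must be part of the prefix string itself.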
I was able to get table names prefixed with the active profile name.
First, I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {
private String envProfile;
public TableNameResolver() {}
public TableNameResolver(String envProfile) {
this.envProfile=envProfile;
}
@Override
public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
String stageName = envProfile.concat("_");
String rawTableName = super.getTableName(clazz, config);
return stageName.concat(rawTableName);
}
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
DynamoDBMapperConfig config = new DynamoDBMapperConfig.Builder()
.withTableNameResolver(new TableNameResolver(envProfile))
.build();
return new DynamoDBMapper(amazonDynamoDB, config);
}
The envProfile variable holds the active profile value, accessed from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
We had the same issue regarding the need to change table names at runtime. We are using spring-data-dynamodb 5.0.2, and the following configuration provides the solution we need.
First, I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired by Spring via SpEL:
@Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
builder.setTableNameOverride(tableNameOverride);
builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method using the DynamoDBMapperConfig builder. But this also does the job.
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private Long id;
private String firstName;
private String lastName;
Create two implementations with different table names and Spring profiles:
@Profile(value= {"dev","default"})
@Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}
@Profile(value= {"prod"})
@Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {
@Autowired
protected DataSource dataSource;
@BeforeClass
public static void setUp() {
System.setProperty("spring.profiles.active", "prod");
}
@Test
public void test1() throws Exception {
DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
tables.next();
assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
}
}
I'm developing a translation service that currently works inside other services. For example:
public Profile getById(int chainId, int profileId, Integer languageId) {
Profile profile = profileRepository.getById(chainId, profileId);
translationService.translate(profile, languageId); // Here
return profile;
}
Now, to avoid calling a translate method in every service method across the application, and as I only have the user's language in the controller, I would like to execute the translate method before every Profile (and any other object) is returned to the client.
I tried implementing HandlerInterceptor in a custom interceptor, but it doesn't seem to give access to the instance of the object I'm returning. Can anyone help?
Another option could be to translate every object that comes from a select in Hibernate, but I couldn't find a good solution this way either...
The solution was to use Spring AOP. Probably the question wasn't very well explained, but what we needed was a way to intercept the objects users request from the backend, because they can create their own translations, which we save in the database. We had to return the model with the correct translation for each user, whose locale is stored in their profile. Here's how we intercept it:
@Component
@Aspect
public class TranslatorInterceptor extends AccessApiController {
Logger logger = LoggerFactory.getLogger(this.getClass());
@Autowired
public TranslationService translationService;
@Pointcut("execution(* com.company.project.api.controller.*.get*(..))")
public void petitionsStartWithGet() { }
@Pointcut("execution(* com.company.project.api.controller.*.list*(..))")
public void petitionsStartWithList() { }
@Pointcut("execution(* com.company.project.api.controller.*.find*(..))")
public void petitionsStartWithFind() { }
@AfterReturning(pointcut = "petitionsStartWithGet() || petitionsStartWithList() || petitionsStartWithFind()", returning = "result")
public void getNameAdvice(JoinPoint joinPoint, Object result){
translationService.translate(result, getCustomUserDetails().getLanguageId());
logger.debug("Translating " + result.getClass().toString());
}
}
What we do here is "watch" all the methods in the "controller" package that start with 'get', 'list' or 'find' (getById(), for example), and through this advice we intercept the object before it is sent to Jackson. The getCustomUserDetails method comes from AccessApiController, a class we wrote to provide our controllers with some information we need.
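The after-returning interception above can be sketched without Spring using a JDK dynamic proxy: after a method whose name starts with "get", "list" or "find" returns, the advice sees (and here transforms) the return value before the caller does. ProfileApi and the appended marker are illustrative stand-ins, not the project's real types or translation logic:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class AfterReturningSketch {
    interface ProfileApi {
        String getById(int id);
    }

    // Wrap a target so matching methods have their result post-processed,
    // mimicking @AfterReturning advice on get*/list*/find* methods.
    static ProfileApi advised(ProfileApi target) {
        return (ProfileApi) Proxy.newProxyInstance(
                ProfileApi.class.getClassLoader(),
                new Class<?>[] { ProfileApi.class },
                (Object proxy, Method method, Object[] args) -> {
                    Object result = method.invoke(target, args);
                    String name = method.getName();
                    if (name.startsWith("get") || name.startsWith("list") || name.startsWith("find")) {
                        result = result + " (translated)"; // the "advice" step
                    }
                    return result;
                });
    }

    public static void main(String[] args) {
        ProfileApi api = advised(id -> "profile-" + id);
        System.out.println(api.getById(7)); // prints "profile-7 (translated)"
    }
}
```

Spring's proxy-based AOP does essentially this (with CGLIB or JDK proxies) before Jackson serializes the return value.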
I have a Spring Boot application with MongoDB as the persistence layer. I have the following structure:
public class Resource {
@Id
public String Id;
...
}
I also have a ResourceRepository:
@RepositoryRestResource(collectionResourceRel = "resources", path = "resources")
public interface ResourceRepository extends MongoRepository<Resource, String> {
Resource findById(@Param("Id") String Id);
}
I found online that a way to have the id property returned in the JSON when you perform a GET request like http://localhost:8080/resources/ is to change the id property to Id (uppercase I). Indeed, if the property is lowercase I don't get back an id field, but if I change it to uppercase I do. As I need the id property returned, I used the uppercase I. So far, so good.
However, when I tried to execute the query findById included in my repository I get an exception:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
If I change the Id property to id (lowercase i) I can execute successfully the /resources/search/findById?id=... GET request.
I tried creating a custom controller with a query that finds and returns a Resource based on the id that is given:
@Controller
@RequestMapping("/resource")
public class ResourceController {
@Autowired
MongoOperations mongoOperations;
@RequestMapping(value="/findById/{resourceId}/", method= RequestMethod.GET)
@ResponseBody
public Resource findByResourceId(@PathVariable("resourceId") String resourceId) {
Resource resource = mongoOperations.findOne(query(Criteria.where("Id").is(resourceId)), Resource.class, "DOJ");
return resource;
}
}
but I receive the same error:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
Any idea how to both have the id property displayed in the JSON and be able to use findById?
Well, I found the answer myself: switch back to lowercase id so findById works, and add the following class to the project:
@Configuration
public class SpringDataRestConfiguration extends RepositoryRestConfigurerAdapter {
@Override
public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
config.exposeIdsFor(Resource.class);
}
}
As the name of the method suggests, this configuration makes Resource class objects expose their ids in the JSON.
UPDATE: If you are using the latest or a relatively recent version of Spring Boot, the RepositoryRestConfigurerAdapter class has been deprecated, and its Javadoc suggests implementing the RepositoryRestConfigurer interface directly.
So your code should look like this:
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer
...