Can I change the default Swagger definition from 'default' to my own? I would like the page to load my definition, which in this case is just called 'swagger', instead of the 'default' one.
I am using Springfox and Spring Boot. This is my Swagger config class:
@Configuration
@EnableSwagger2WebMvc
@Import(SpringDataRestConfiguration.class)
public class SwaggerDocumentationConfig {

    @Bean
    public Docket api() {
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.basePackage("com.openet.usage.trigger"))
                .paths(PathSelectors.any())
                .build();
    }

    private static Predicate<String> matchPathRegex(final String... pathRegexs) {
        return new Predicate<String>() {
            @Override
            public boolean apply(String input) {
                for (String pathRegex : pathRegexs) {
                    if (input.matches(pathRegex)) {
                        return true;
                    }
                }
                return false;
            }
        };
    }

    @Bean
    WebMvcConfigurer configurer() {
        return new WebMvcConfigurerAdapter() {
            @Override
            public void addResourceHandlers(ResourceHandlerRegistry registry) {
                registry.addResourceHandler("/config/swagger.json")
                        .addResourceLocations("classpath:/config");
                registry
                        .addResourceHandler("swagger-ui.html")
                        .addResourceLocations("classpath:/META-INF/resources/");
                registry
                        .addResourceHandler("/webjars/**")
                        .addResourceLocations("classpath:/META-INF/resources/webjars/");
            }
        };
    }
}
It is possible to change this behavior, but it looks more like a hack.
The SwaggerResourcesProvider interface is responsible for supplying the entries of the dropdown list. First, implement this interface. Second, add the @Primary annotation to your class so that it becomes the main implementation, used instead of the default InMemorySwaggerResourcesProvider. It still makes sense to reuse the definitions provided by InMemorySwaggerResourcesProvider, which is why it is injected.
The last part is to override the get method and return only the list you want to display. This example displays only the single definition named 'swagger'.
// other annotations
@Primary
public class SwaggerDocumentationConfig implements SwaggerResourcesProvider {

    private final InMemorySwaggerResourcesProvider resourcesProvider;

    @Inject
    public SwaggerDocumentationConfig(InMemorySwaggerResourcesProvider resourcesProvider) {
        this.resourcesProvider = resourcesProvider;
    }

    @Override
    public List<SwaggerResource> get() {
        return resourcesProvider.get().stream()
                .filter(r -> "swagger".equals(r.getName()))
                .collect(Collectors.toList());
    }

    // the rest of the configuration
}
I just did a redirect in my controller:
@RequestMapping(value = "/", method = RequestMethod.GET)
public void redirectRootToSwaggerDocs(HttpServletResponse response) throws IOException {
    response.sendRedirect("/my-api/swagger-ui.html?urls.primaryName=swagger");
}
The easiest way I found is to make the groupName sort first alphabetically, such as "1 swagger", "a swagger" or "-> swagger".
...
return new Docket(DocumentationType.OAS_30)
        .groupName("-> swagger");
...
...
return new Docket(DocumentationType.OAS_30)
        .groupName("<what u want>");
...
Just set a default group name.
I need to react asynchronously to an @EventListener, therefore I've created something like this:
@Service
public class AsyncHandler {

    private CompletableFuture<My> future;

    @Async
    public CompletableFuture<My> getMy() {
        future = new CompletableFuture<>();
        return future;
    }

    @EventListener
    public void processEvent(MyEvent event) {
        future.complete(event.my());
    }
}
The problem here is that AsyncHandler is now stateful, which is wrong.
And I don't want to use a database, so is there any other way to make the bean stateless while still using @EventListener?
You are right, your singleton has state, which is "not good".
One possible solution:
refactor the "stateful part" into a prototype- (or request-/session-) scoped bean;
make your "singleton" abstract;
inject the "stateful part" via method injection (we cannot autowire shorter-lived beans into longer-lived ones); see also the @Lookup sketch after the references below.
As code (example):
State holder:
public class MyStateHolder {

    // State:
    private CompletableFuture<My> future;

    @Async // ?
    public CompletableFuture<My> getMy() {
        future = new CompletableFuture<>();
        return future;
    }
}
Abstract (no @Service ...yet, and no state!):
public abstract class AsyncHandler {

    @EventListener
    public void processEvent(MyEvent event) {
        // !!
        delegate().getMy().complete(event.my());
    }

    // and now only (abstract!):
    public abstract MyStateHolder delegate();
}
Wiring:
@Configuration
class MyConfig {

    @Bean
    @Scope("prototype") // !
    public MyStateHolder stateful() {
        return new MyStateHolder();
    }

    @Bean // singleton/service:
    public AsyncHandler asyncHandler() {
        return new AsyncHandler() { // !
            @Override // !
            public MyStateHolder delegate() {
                return stateful(); // !;)
            }
        };
    }
}
refs: (most of) https://docs.spring.io/spring-framework/docs/current/reference/html/core.html
Especially:
https://docs.spring.io/spring-framework/docs/current/reference/html/core.html#beans-factory-scopes-sing-prot-interaction
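As a side note, the same singleton-to-prototype wiring can also be expressed with @Lookup method injection instead of the abstract @Bean subclass in MyConfig. This is only a sketch, under the assumption that MyStateHolder is itself registered as a prototype-scoped bean (e.g. annotated with @Component and @Scope("prototype")):
@Service
public abstract class AsyncHandler {

    @EventListener
    public void processEvent(MyEvent event) {
        delegate().getMy().complete(event.my());
    }

    // Spring overrides this method at runtime and returns a freshly
    // resolved MyStateHolder (a new instance, since it is prototype-scoped).
    @Lookup
    public abstract MyStateHolder delegate();
}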
I am working in an environment that changes credentials every few minutes. For the beans that implement clients depending on these credentials to keep working, the beans need to be refreshed. I decided that a good approach would be to implement a custom scope for this.
After looking around a bit in the documentation, I found that the main method a scope has to implement is the get method:
public class CyberArkScope implements Scope {

    private Map<String, Pair<LocalDateTime, Object>> scopedObjects = new ConcurrentHashMap<>();
    private Map<String, Runnable> destructionCallbacks = new ConcurrentHashMap<>();
    private Integer scopeRefresh;

    public CyberArkScope(Integer scopeRefresh) {
        this.scopeRefresh = scopeRefresh;
    }

    @Override
    public Object get(String name, ObjectFactory<?> objectFactory) {
        if (!scopedObjects.containsKey(name) || scopedObjects.get(name).getKey()
                .isBefore(LocalDateTime.now().minusMinutes(scopeRefresh))) {
            scopedObjects.put(name, Pair.of(LocalDateTime.now(), objectFactory.getObject()));
        }
        return scopedObjects.get(name).getValue();
    }

    @Override
    public Object remove(String name) {
        destructionCallbacks.remove(name);
        return scopedObjects.remove(name);
    }

    @Override
    public void registerDestructionCallback(String name, Runnable runnable) {
        destructionCallbacks.put(name, runnable);
    }

    @Override
    public Object resolveContextualObject(String name) {
        return null;
    }

    @Override
    public String getConversationId() {
        return "CyberArk";
    }
}
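The scope is registered through CyberArkScopeConfig (imported below but not shown here). A minimal sketch of what such a config might look like, assuming it simply registers the scope under the name "CyberArk" via Spring's CustomScopeConfigurer (the 5-minute refresh value is an assumption):
@Configuration
public class CyberArkScopeConfig {

    @Bean
    public static CustomScopeConfigurer cyberArkScopeConfigurer() {
        CustomScopeConfigurer configurer = new CustomScopeConfigurer();
        // Register the custom scope under the name used in @Scope("CyberArk");
        // the refresh interval of 5 minutes is just an example value.
        configurer.addScope("CyberArk", new CyberArkScope(5));
        return configurer;
    }
}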
@Configuration
@Import(CyberArkScopeConfig.class)
public class TestConfig {

    @Bean
    @Scope(scopeName = "CyberArk")
    public String dateString() {
        return LocalDateTime.now().toString();
    }
}
@RestController
public class HelloWorld {

    @Autowired
    private String dateString;

    @RequestMapping("/")
    public String index() {
        return dateString;
    }
}
When I debug this implementation with a simple String bean (in this scope) autowired into a controller, I see that the get method is only called once at startup and never again, which means the bean is never refreshed. Is there something wrong with this behaviour, or is that how the get method is supposed to work?
It seems you also need to define the proxyMode, which injects an AOP proxy instead of a static reference to the String bean. Note that the bean class cannot be final. This solved it:
@Configuration
@Import(CyberArkScopeConfig.class)
public class TestConfig {

    @Bean
    @Scope(scopeName = "CyberArk", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public NonFinalString dateString() {
        return new NonFinalString(LocalDateTime.now());
    }
}
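NonFinalString is not part of Spring; it is just a simple non-final wrapper so that CGLIB can subclass it for the scoped proxy. A minimal sketch of what such a class might look like (the class itself is an assumption, it is not shown in the original answer):
// Hypothetical holder class used only for illustration: it must not be final,
// so that Spring can create a CGLIB (TARGET_CLASS) proxy around it.
public class NonFinalString {

    private final String value;

    public NonFinalString(Object value) {
        this.value = String.valueOf(value);
    }

    @Override
    public String toString() {
        return value;
    }
}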
I want to override the controller methods autogenerated by @RepositoryRestResource using @RepositoryRestController, having set SDR's base path to "/api".
Spring Data Rest 3.0 (and earlier) says:
"This controller [as shown in the snippet] will be served from the same API base path defined in RepositoryRestConfiguration.setBasePath that is used by all other RESTful endpoints (e.g. /api)".
https://docs.spring.io/spring-data/rest/docs/3.0.1.RELEASE/reference/html/#customizing-sdr.overriding-sdr-response-handlers (chapter 15.4)
That code snippet DOES NOT have a @RequestMapping at the class level, though.
My SDR app configures the RepositoryRestConfiguration object with
config.setBasePath("/api");
and yet @RepositoryRestController doesn't override SDR's autogenerated controller methods.
Please consider the accepted answer to this post:
Spring Data Rest controllers: behaviour and usage of @BasePathAwareController, @RepositoryRestController, @Controller and @RestController
Please help me understand this! :)
AppConf.java:
@Configuration
@Import(value = {DataConf.class})
@EnableWebMvc
@ComponentScan(value = "pl.mydomain.controller")
public class AppConf
{
    @Bean
    public RepositoryRestConfigurer repositoryRestConfigurer() {
        return new RepositoryRestConfigurerAdapter() {
            public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
                config.setBasePath("/api");
            }
        };
    }
}
TokenController.java:
@RepositoryRestController
public class TokenController
{
    private TokenRepository repository;

    @Autowired
    public TokenController(TokenRepository tokenRepository) {
        this.repository = tokenRepository;
    }

    @RequestMapping(method = GET, path = "/tokens")
    public @ResponseBody ResponseEntity<?> tokens()
    {
        return ResponseEntity.ok("Hello");
    }
}
TokenRepository.java:
@RepositoryRestResource(path = "tokens")
public interface TokenRepository extends CrudRepository<Token, Long> {
}
The key to resolving the above dilemma was configuring the project correctly, that is, putting @ComponentScan in the class passed to the AbstractAnnotationConfigDispatcherServletInitializer::getServletConfigClasses() method (not in AppConf.java, which is passed to getRootConfigClasses()).
DispatcherConf.java:
public class DispatcherConf extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        return new Class[] {AppConf.class};
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        return new Class[] {WebConf.class}; // !!!
    }

    @Override
    protected String[] getServletMappings() {
        return new String[] {"/*"};
    }
}
AppConf.java:
@Configuration
@Import({DataConf.class})
public class AppConf
{
    @Bean
    public RepositoryRestConfigurer repositoryRestConfigurer() {
        return new RepositoryRestConfigurerAdapter() {
            @Override
            public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
                config.setBasePath("/api"); // !!!
            }
        };
    }
}
DataConf.java:
@Configuration
@EnableJpaRepositories(basePackages = {
        "pl.example.data.repository"
})
@EnableTransactionManagement
public class DataConf
{ ... }
WebConf.java:
@Import(RepositoryRestMvcConfiguration.class)
@ComponentScan({"pl.example.api.controller"}) // !!!
public class WebConf {
}
Even though I solved the riddle, I still don't understand why it was an issue in the first place, given that https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/context/annotation/ComponentScan.html states:
Annotation Type ComponentScan: Configures component scanning directives for use with @Configuration classes.
I have created my own repository like this:
public interface MyRepository extends TypedIdCassandraRepository<MyEntity, String> {
}
So the question is: how do I automatically create the Cassandra table for it? Currently Spring injects MyRepository, which tries to insert the entity into a non-existent table.
Is there a way to create Cassandra tables (if they do not exist) during Spring container startup?
P.S. It would be very nice if there were just a boolean config property, without adding lines of XML or creating something like a BeanFactory, etc. :-)
Override the getSchemaAction property on the AbstractCassandraConfiguration class:
@Configuration
@EnableCassandraRepositories(basePackages = "com.example")
public class TestConfig extends AbstractCassandraConfiguration {

    @Override
    public String getKeyspaceName() {
        return "test_config";
    }

    @Override
    public SchemaAction getSchemaAction() {
        return SchemaAction.RECREATE_DROP_UNUSED;
    }

    @Bean
    public CassandraOperations cassandraOperations() throws Exception {
        return new CassandraTemplate(session().getObject());
    }
}
You can use this config in application.properties:
spring.data.cassandra.schema-action=CREATE_IF_NOT_EXISTS
You'll also need to override the getEntityBasePackages() method in your AbstractCassandraConfiguration implementation. This allows Spring to find any classes that you've annotated with @Table and create the tables.
@Override
public String[] getEntityBasePackages() {
    return new String[]{"com.example"};
}
You'll need to include the spring-data-cassandra dependency in your pom.xml file.
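For example, a typical Maven coordinate (a sketch; the version is deliberately omitted here and should come from your Spring or Spring Boot dependency management):
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-cassandra</artifactId>
    <!-- version omitted; let your Spring (Boot) BOM manage it -->
</dependency>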
Configure your TestConfig class (here called CassandraConfig) as below:
@Configuration
@PropertySource(value = { "classpath:Your .properties file here" })
@EnableCassandraRepositories(basePackages = { "base-package name of your Repositories" })
public class CassandraConfig {

    @Autowired
    private Environment environment;

    @Bean
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(environment.getProperty("contactpoints from your properties file"));
        cluster.setPort(Integer.parseInt(environment.getProperty("ports from your properties file")));
        return cluster;
    }

    @Bean
    public CassandraConverter converter() throws ClassNotFoundException {
        return new MappingCassandraConverter(mappingContext());
    }

    @Bean
    public CassandraSessionFactoryBean session() throws Exception {
        CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
        session.setCluster(cluster().getObject());
        session.setKeyspaceName(environment.getProperty("keyspace from your properties file"));
        session.setConverter(converter());
        session.setSchemaAction(SchemaAction.CREATE_IF_NOT_EXISTS);
        return session;
    }

    @Bean
    public CassandraOperations cassandraTemplate() throws Exception {
        return new CassandraTemplate(session().getObject());
    }

    @Bean
    public CassandraMappingContext mappingContext() throws ClassNotFoundException {
        CassandraMappingContext mappingContext = new CassandraMappingContext();
        mappingContext.setInitialEntitySet(getInitialEntitySet());
        return mappingContext;
    }

    public String[] getEntityBasePackages() {
        return new String[]{"base-package name of all your entities annotated with @Table"};
    }

    protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
        return CassandraEntityClassScanner.scan(getEntityBasePackages());
    }
}
This last getInitialEntitySet method might be optional; try without it too.
Make sure your keyspace, contact points and port are in the .properties file, like:
cassandra.contactpoints=localhost,127.0.0.1
cassandra.port=9042
cassandra.keyspace='Your Keyspace name here'
Actually, after digging into the source code of spring-data-cassandra:3.1.9, you can check the implementation of
org.springframework.data.cassandra.config.SessionFactoryFactoryBean#performSchemaAction
which looks like this:
protected void performSchemaAction() throws Exception {

    boolean create = false;
    boolean drop = DEFAULT_DROP_TABLES;
    boolean dropUnused = DEFAULT_DROP_UNUSED_TABLES;
    boolean ifNotExists = DEFAULT_CREATE_IF_NOT_EXISTS;

    switch (this.schemaAction) {
        case RECREATE_DROP_UNUSED:
            dropUnused = true;
        case RECREATE:
            drop = true;
        case CREATE_IF_NOT_EXISTS:
            ifNotExists = SchemaAction.CREATE_IF_NOT_EXISTS.equals(this.schemaAction);
        case CREATE:
            create = true;
        case NONE:
        default:
            // do nothing
    }

    if (create) {
        createTables(drop, dropUnused, ifNotExists);
    }
}
which means you have to assign CREATE to schemaAction if the table has never been created; CREATE_IF_NOT_EXISTS does not work.
For more information, please check here: Why `spring-data-jpa` with `spring-data-cassandra` won't create cassandra tables automatically?
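For example, in a configuration similar to the TestConfig shown earlier, the schema action can be set explicitly (a sketch; the class, keyspace and package names are assumptions):
@Configuration
@EnableCassandraRepositories(basePackages = "com.example")
public class CassandraCreateConfig extends AbstractCassandraConfiguration {

    @Override
    public String getKeyspaceName() {
        return "test_config";
    }

    @Override
    public SchemaAction getSchemaAction() {
        // CREATE creates the tables on startup; the answer above reports that
        // CREATE_IF_NOT_EXISTS did not create them in this version.
        return SchemaAction.CREATE;
    }

    @Override
    public String[] getEntityBasePackages() {
        // packages containing the @Table-annotated entities
        return new String[] { "com.example" };
    }
}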
Given the following service:
public interface MyService {
    void method();
}
And its implementation:
@Service
public class MyServiceImpl implements MyService {

    @Transactional
    @CustomAnnotation
    @Override
    public void method() {
        ...
    }
}
I would like to use a StaticMethodMatcherPointcutAdvisor in the following manner:
public class MyPointcutAdvisor extends StaticMethodMatcherPointcutAdvisor {
    ...
    @Override
    public boolean matches(Method method, Class<?> targetClass) {
        if (annotationPresent(method)) {
            return true;
        }
        Class<?> userClass = ClassUtils.getUserClass(targetClass);
        Method specificMethod = ClassUtils.getMostSpecificMethod(method, userClass);
        specificMethod = BridgeMethodResolver.findBridgedMethod(specificMethod);
        if (annotationPresent(specificMethod)) {
            return true;
        }
        return false;
    }
    ...
}
The problem is that Spring uses an InfrastructureAdvisorAutoProxyCreator to create the proxy of that class, whereas the DefaultAdvisorAutoProxyCreator would create the proxy for the MyPointcutAdvisor; but MyPointcutAdvisor is only given the proxy class as the targetClass parameter. Thus the PointcutAdvisor cannot find the annotation and therefore does not match.
For completeness, this is my configuration class:
@Configuration
@EnableTransactionManagement
public class MyConfiguration {

    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }
    ...
}
My question is: is there a way to use @EnableTransactionManagement in combination with a StaticMethodMatcherPointcutAdvisor?
Workarounds:
Put @CustomAnnotation into the service interface: but I want to keep my interfaces clean.
Add @Role(BeanDefinition.ROLE_INFRASTRUCTURE) to the MyPointcutAdvisor bean definition, so that the InfrastructureAdvisorAutoProxyCreator creates the proxy. This seems like the wrong way, since this bean is not infrastructure.
Copy the beans from ProxyTransactionManagementConfiguration, remove @EnableTransactionManagement and remove @Role(BeanDefinition.ROLE_INFRASTRUCTURE), so that the DefaultAdvisorAutoProxyCreator creates the proxy. This is my current workaround and results in the following configuration:
@Configuration
public class MyWorkaroundConfiguration {

    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }

    @Bean
    public TransactionAttributeSource transactionAttributeSource() {
        return new AnnotationTransactionAttributeSource();
    }

    @Bean(name = TransactionManagementConfigUtils.TRANSACTION_ADVISOR_BEAN_NAME)
    public BeanFactoryTransactionAttributeSourceAdvisor transactionAdvisor(
            TransactionInterceptor transactionInterceptor) {
        BeanFactoryTransactionAttributeSourceAdvisor advisor =
                new BeanFactoryTransactionAttributeSourceAdvisor();
        advisor.setTransactionAttributeSource(transactionAttributeSource());
        advisor.setAdvice(transactionInterceptor);
        return advisor;
    }

    @Bean
    public TransactionInterceptor transactionInterceptor(
            PlatformTransactionManager transactionManager) {
        TransactionInterceptor interceptor = new TransactionInterceptor();
        interceptor.setTransactionAttributeSource(transactionAttributeSource());
        interceptor.setTransactionManager(transactionManager);
        return interceptor;
    }
    ...
}
Using @EnableAspectJAutoProxy instead of the DefaultAdvisorAutoProxyCreator works for me.
@Configuration
@EnableAspectJAutoProxy
@EnableTransactionManagement
public class MyConfiguration {
}
This also allows using @Aspect, as M. Deinum suggested.
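For example, a minimal sketch of such an aspect (the pointcut expression and the fully qualified annotation name com.example.CustomAnnotation are assumptions; the advice body is just a placeholder):
@Aspect
@Component
public class CustomAnnotationAspect {

    // Runs around every method annotated with @CustomAnnotation.
    @Around("@annotation(com.example.CustomAnnotation)")
    public Object aroundCustomAnnotatedMethod(ProceedingJoinPoint pjp) throws Throwable {
        // custom behaviour before the method call
        try {
            return pjp.proceed();
        } finally {
            // custom behaviour after the method call
        }
    }
}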