I am currently working on Spring annotation-based dependency injection for an Activity Worker and a Workflow Worker, as per the documentation. I have defined my beans inside my Spring Boot application, and each worker lives in a separate Maven module. The issue I am facing is that when I run the ActivityWorker Spring Boot module it stays active and starts polling for activities, but the Workflow Worker module stops immediately after starting with the message:
Unregistering JMX-exposed beans on shutdown
My implementation is as follows:
@Activities(version = "2.2")
@ActivityRegistrationOptions(defaultTaskScheduleToStartTimeoutSeconds = 300, defaultTaskStartToCloseTimeoutSeconds = 100)
public interface TempActivities {
public GreetWrapper getName();
public void say(String what);
/* public Integer doProcess();
public int sum(Integer num);*/
}
public class TempActivitiesImpl implements TempActivities {
GreetWrapper greetObj = new GreetWrapper();
public TempActivitiesImpl() {
// TODO Auto-generated constructor stub
}
@Override
public GreetWrapper getName() {
greetObj.setGreet("World");
return greetObj;
}
@Override
public void say(String what) {
System.out.println(what);
}
}
@Workflow(dataConverter = GreetWrapper.class)
@WorkflowRegistrationOptions(defaultExecutionStartToCloseTimeoutSeconds = 3600)
public interface TempWorkflow {
@Execute(name = "TempWorkflow", version = "2.2")
public void greet();
}
public class TempWorkflowImpl implements TempWorkflow {
private TempActivitiesClient activitiesClientImpl = new TempActivitiesClientImpl();
private DecisionContextProvider contextProvider = new DecisionContextProviderImpl();
private WorkflowClock clock
= contextProvider.getDecisionContext().getWorkflowClock();
@Override
public void greet() {
greet1(0);
}
public void greet1(int count, Promise<?>... waitFor) {
if (count == 3) {
return;
}
Promise<GreetWrapper> name = activitiesClientImpl.getName();
Promise<String> greeting = getGreeting(name);
activitiesClientImpl.say(greeting);
Promise<Void> timer = clock.createTimer(30);
greet1(count + 1, timer);
}
@Asynchronous
public Promise<String> getGreeting(Promise<GreetWrapper> name) {
String greeting = "Hello " + name.get().getGreet();
System.out.println("Greeting: " + greeting);
return Promise.asPromise(greeting);
}
}
Here are my Activity Worker beans:
@Configuration
public class AppConfig {
public String getActivityTasklistName() {
return "HelloWorldTaskList";
}
public String getDomainName() {
return "helloWorldWalkthrough2";
}
public String getWorkflowTasklistName() {
return "HelloWorldWorkflow";
}
public String getEndPoint() {
String endPoint = "https://swf.us-east-1.amazonaws.com";
return endPoint;
}
String swfAccessId = System.getenv("AWS_ACCESS_KEY_ID");
String swfSecretKey = System.getenv("AWS_SECRET_ACCESS_KEY");
/*@Autowired
TempActivities tempActivitiesImpl;
@Autowired
TempWorkflow tempWorkflowImpl; */
@Bean
public ClientConfiguration clientConfiguration() {
ClientConfiguration config = new ClientConfiguration();
config.withSocketTimeout(70 * 1000);
return config;
}
@Bean
public AWSCredentials basicAWSCredentials() {
BasicAWSCredentials basicAWSCredentials = new BasicAWSCredentials(swfAccessId, swfSecretKey);
return basicAWSCredentials;
}
@Bean
public AmazonSimpleWorkflow amazonSimpleWorkflowClient() {
AmazonSimpleWorkflow amazonSimpleWorkflowClient = new AmazonSimpleWorkflowClient(basicAWSCredentials(), clientConfiguration());
amazonSimpleWorkflowClient.setEndpoint(getEndPoint());
return amazonSimpleWorkflowClient;
}
@Bean
public TempActivitiesClient tempActivitiesClient() {
TempActivitiesClient tempActivitiesClient = new TempActivitiesClientImpl();
return tempActivitiesClient;
}
@Bean
public SpringActivityWorker springActivityWorker() throws InstantiationException, IllegalAccessException, SecurityException, NoSuchMethodException {
SpringActivityWorker activityWorker = new SpringActivityWorker(amazonSimpleWorkflowClient(), getDomainName(), getWorkflowTasklistName());
activityWorker.addActivitiesImplementation(new TempActivitiesImpl());
return activityWorker;
}
}
Here are my Workflow Worker beans:
public class WorkFlowAppConfig {
public String getActivityTasklistName() {
return "HelloWorldTaskList";
}
public String getDomainName() {
return "helloWorldWalkthrough2";
}
public String getWorkflowTasklistName() {
return "HelloWorldWorkflow";
}
public String getEndPoint() {
String endPoint = "https://swf.us-east-1.amazonaws.com";
return endPoint;
}
String swfAccessId = System.getenv("AWS_ACCESS_KEY_ID");
String swfSecretKey = System.getenv("AWS_SECRET_ACCESS_KEY");
/*@Autowired
TempActivities tempActivitiesImpl;*/
@Autowired
TempWorkflow tempWorkflowImpl;
@Bean
@Scope("workflow")
public ClientConfiguration clientConfiguration() {
ClientConfiguration config = new ClientConfiguration();
config.withSocketTimeout(70 * 1000);
return config;
}
@Bean
@Scope("workflow")
public AWSCredentials basicAWSCredentials() {
BasicAWSCredentials basicAWSCredentials = new BasicAWSCredentials(swfAccessId, swfSecretKey);
return basicAWSCredentials;
}
@Bean
@Scope("workflow")
public AmazonSimpleWorkflow amazonSimpleWorkflowClient() {
AmazonSimpleWorkflow amazonSimpleWorkflowClient = new AmazonSimpleWorkflowClient(basicAWSCredentials(), clientConfiguration());
amazonSimpleWorkflowClient.setEndpoint(getEndPoint());
return amazonSimpleWorkflowClient;
}
@Bean
@Scope("workflow")
public TempActivitiesClient activitiesClientImpl() {
return new TempActivitiesClientImpl();
}
@Bean
@Scope("workflow")
public SpringWorkflowWorker springWorkflowWorker() throws InstantiationException, IllegalAccessException {
SpringWorkflowWorker workflowWorker = new SpringWorkflowWorker(amazonSimpleWorkflowClient(), getDomainName(), getWorkflowTasklistName());
workflowWorker.addWorkflowImplementation(tempWorkflowImpl);
workflowWorker.setRegisterDomain(true);
// workflowWorker.setDomainRetentionPeriodInDays(1);
return workflowWorker;
}
@Bean
public CustomScopeConfigurer customScope() {
CustomScopeConfigurer configurer = new CustomScopeConfigurer();
Map<String, Object> workflowScope = new HashMap<String, Object>();
workflowScope.put("workflow", new WorkflowScope());
configurer.setScopes(workflowScope);
return configurer;
}
}
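For context: a non-web Spring Boot module shuts its context down as soon as no non-daemon threads remain after main() returns, and the "Unregistering JMX-exposed beans on shutdown" line is what that shutdown looks like in the log. A minimal launcher sketch (an assumption about how the workflow worker module is started, not the asker's actual code) that keeps the module alive would be:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;

@SpringBootApplication
public class WorkflowWorkerApplication {
    public static void main(String[] args) throws InterruptedException {
        // Keep a reference so the context stays open, then block the main thread
        // so the SpringWorkflowWorker beans keep polling for decision tasks.
        ConfigurableApplicationContext context =
                SpringApplication.run(WorkflowWorkerApplication.class, args);
        Thread.currentThread().join();
    }
}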
Related
I am building a Spring Batch application that fetches data from an external API and loads it into a database. I make a REST call to the external API, which returns a JSON/CSV string depending on the request parameter (json/csv).
The application currently works fine for a CSV file saved to the filesystem. I am trying to get rid of creating a new file and using that file as input every time; I want to achieve the loading without creating any file on disk.
I have googled and tried different solutions, and I have replaced FlatFileItemReader with JsonItemReader.
Can you please help me here?
ReportService.java
@Slf4j
@Service
public class ReportService {
public static final String OUTPUT_FILE_FORMAT = "csv";
@Autowired
ProxyRestClient proxyRestClient;
@Autowired
FileOperations fileWriter;
@Autowired
CommonUtility commonUtil;
@Autowired
ReportMapping reportMapping;
@Autowired
ReportConfig reportConfig;
@Autowired
JobMapping jobMap;
@Autowired
DataIntegrationResponseBuilder responseBuilder;
@Autowired
ImportJobLauncher batchJob;
@Autowired
WFitsPricingParser wFitsPricingParser;
public ResponseEntity<DataIntegrationResponse> getReport(String jobName) throws Exception {
HashMap<String, String> fitsParamMap = new HashMap<String, String>();
fitsParamMap = getFitsParameters(jobName);
ResponseEntity<String> response;
final String filePath = Path.of("").toAbsolutePath().toString() + "\\";
final String baseUrl = reportConfig.getBaseURL();
String reportURL = baseUrl + fitsParamMap.get("reportId") + "?format=" + OUTPUT_FILE_FORMAT;
String reportName = fitsParamMap.get("reportName");
log.info("Fetching report from FITS API.");
log.info("FITS API URL: " + reportURL + ".");
response = proxyRestClient.callFitsApi(reportURL, commonUtil.encodedCredentials());
/*
* ObjectMapper mapper = new ObjectMapper(); WFitsVendor[] jsonObj =
* mapper.readValue(response.getBody(), WFitsVendor[].class);
*/
if (response != null) {
if (response.getStatusCodeValue() == 200 && response.hasBody()) {
fileWriter.writeToFile(response.getBody(), filePath + reportName + "." + OUTPUT_FILE_FORMAT);
if (jobName.equals("vendor")) {
BatchJobResponse batchJobResponse = batchJob.importFitsVendor(filePath + reportName + "." + OUTPUT_FILE_FORMAT, response.getBody());
return new ResponseEntity<>(responseBuilder.buildResponse(response.getStatusCode(),
reportName + "." + OUTPUT_FILE_FORMAT, batchJobResponse.getJobName(), batchJobResponse.getJobStatus(), batchJobResponse.getJobId()),
HttpStatus.CREATED);
}
}
}
return new ResponseEntity<>(responseBuilder.buildResponse(HttpStatus.NOT_FOUND, "", "", "", 0L),
HttpStatus.NOT_FOUND);
}
private HashMap<String, String> getFitsParameters(String jobName) {
HashMap<String, String> fitsParamMap = new HashMap<String, String>();
fitsParamMap.put("reportName", reportMapping.getMappings().getOrDefault(jobMap.getMappings().get(jobName), ""));
fitsParamMap.put("reportId", jobMap.getMappings().getOrDefault(jobName, ""));
return fitsParamMap;
}
}
ProxyRestClient.java
@Service
public class ProxyRestClient {
@Autowired
@Qualifier("externalRestTemplate")
RestTemplate restTemplate;
public ResponseEntity<String> callFitsApi(
String reportURL, String encodedCredentials)
throws JsonMappingException, JsonProcessingException {
HttpHeaders headers = new HttpHeaders();
headers.add("Authorization", "Basic " + encodedCredentials);
HttpEntity<String> request = new HttpEntity<>(headers);
ResponseEntity<String> response = null;
ResponseEntity<String> responseWfits = null;
// response = restTemplate.exchange(reportURL, HttpMethod.GET, request,
// String.class);
responseWfits = restTemplate.exchange("https://fitsonline.trgrp.com/msmdsqa/api/report/user/5010296?format=json", HttpMethod.GET, request, String.class);
return responseWfits;
}
}
VendorJob.java
@Configuration
@EnableBatchProcessing
@AllArgsConstructor
public class VendorJob {
private JobBuilderFactory jobBuilderFactory;
private StepBuilderFactory stepBuilderFactory;
private static final String DROP_SCRIPT = "TRUNCATE TABLE MDIA.WFITS_VENDOR";
@Autowired
private HikariDataSource dataSource;
/*
 * This is for the CSV file:
 *
 * @Bean
 * @StepScope
 * public FlatFileItemReader<WFitsVendor> VendorReader(@Value("#{jobParameters['filePath']}") String filePath,
 *         @Value("#{jobParameters['jsonObj']}") String jsonObj) {
 *     FlatFileItemReader<WFitsVendor> itemReader = new FlatFileItemReader<>();
 *     itemReader.setResource(new FileSystemResource(filePath));
 *     itemReader.setName("csvReader");
 *     itemReader.setLinesToSkip(1);
 *     itemReader.setLineMapper(lineMapper());
 *     itemReader.setRecordSeparatorPolicy(new ReaderPolicy());
 *     return itemReader;
 * }
 */
@Bean
@StepScope
public JsonItemReader<WFitsVendor> jsonItemReader(@Value("#{jobParameters['filePath']}") String filePath,
@Value("#{jobParameters['jsonObj']}") String jsonObj) {
ObjectMapper objectMapper = new ObjectMapper();
// configure the objectMapper as required
JacksonJsonObjectReader<WFitsVendor> jsonObjectReader =
new JacksonJsonObjectReader<>(WFitsVendor.class);
jsonObjectReader.setMapper(objectMapper);
return new JsonItemReaderBuilder<WFitsVendor>()
.jsonObjectReader(jsonObjectReader)
.resource(new ByteArrayResource(jsonObj.getBytes()))
.name("jsonItemReader")
.build();
}
private LineMapper<WFitsVendor> lineMapper() {
WFitsVendorLineMapper<WFitsVendor> lineMapper = new WFitsVendorLineMapper<>();
DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
lineTokenizer.setDelimiter(",");
lineTokenizer.setNames("VENDORNAME", "Type", "Notes");
lineTokenizer.setStrict(true);
lineTokenizer.setIncludedFields(0, 1, 2);
BeanWrapperFieldSetMapper<WFitsVendor> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
fieldSetMapper.setTargetType(WFitsVendor.class);
lineMapper.setLineTokenizer(lineTokenizer);
lineMapper.setFieldSetMapper(fieldSetMapper);
return lineMapper;
}
@Bean
public VendorProcessor VendorProcessor() {
return new VendorProcessor();
}
@Bean
public JdbcBatchItemWriter<WFitsVendor> VendorWriter() {
JdbcBatchItemWriter<WFitsVendor> databaseItemWriter = new JdbcBatchItemWriter<>();
databaseItemWriter.setDataSource(dataSource);
databaseItemWriter.setSql(
"INSERT INTO MDIA.WFITS_VENDOR(VENDOR, VENDOR_INFO, VENDOR_TYPE, CREATED_BY) VALUES (?, ?, ?, ?)");
ItemPreparedStatementSetter<WFitsVendor> valueSetter = new WFitsVendorPreparedStatementSetter();
databaseItemWriter.setItemPreparedStatementSetter(valueSetter);
return databaseItemWriter;
}
@Bean
public Step loadVendorTable() {
return stepBuilderFactory.get("load-wfitsvendor-table").<WFitsVendor, WFitsVendor> chunk(10000)
.reader(jsonItemReader(null, null)).writer(VendorWriter()).processor(VendorProcessor()).faultTolerant()
.taskExecutor(VendortaskExecutor()).build();
}
@Bean
public Step truncateVendorTable() {
return stepBuilderFactory.get("truncate-wfitsvendor-table").tasklet(truncateTableTasklet()).build();
}
public Tasklet truncateTableTasklet() {
return (contribution, chunkContext) -> {
new JdbcTemplate(dataSource).execute(DROP_SCRIPT);
return RepeatStatus.FINISHED;
};
}
@Bean
@Qualifier("VendorJob")
public Job runVendorJob() {
return jobBuilderFactory.get("VendorJob").listener(new JobCompletionListener()).start(truncateVendorTable())
.next(loadVendorTable()).build();
}
@Bean
public TaskExecutor VendortaskExecutor() {
return new ConcurrentTaskExecutor(Executors.newCachedThreadPool());
}
}
VendorLineMapper.java
public class VendorLineMapper<T> implements LineMapper<WFitsVendor> , InitializingBean {
private LineTokenizer tokenizer;
private FieldSetMapper<WFitsVendor> fieldSetMapper;
@Override
public WFitsVendor mapLine(String line, int lineNumber) throws Exception {
WFitsVendor vr = fieldSetMapper.mapFieldSet(tokenizer.tokenize(line));
//System.out.println(line);
vr.setLineNo(lineNumber);
return vr;
}
public void setLineTokenizer(LineTokenizer tokenizer) {
this.tokenizer = tokenizer;
}
public void setFieldSetMapper(FieldSetMapper<WFitsVendor> fieldSetMapper) {
this.fieldSetMapper = fieldSetMapper;
}
@Override
public void afterPropertiesSet() {
Assert.notNull(tokenizer, "The LineTokenizer must be set");
Assert.notNull(fieldSetMapper, "The FieldSetMapper must be set");
}
}
ReaderPolicy.java
public class ReaderPolicy extends DefaultRecordSeparatorPolicy {
@Override
public boolean isEndOfRecord(final String line) {
return line.trim().length() != 0 && super.isEndOfRecord(line);
}
@Override
public String postProcess(final String record) {
if (record == null || record.trim().length() == 0) {
return null;
}
return super.postProcess(record);
}
}
JSON Returned from API:
You can use a URLResource and point your JSON item reader to it, something like:
@Bean(destroyMethod = "close")
public InputStream urlResource() throws IOException {
URL url = new URL("https://path.to.your.resource");
URLConnection urlConnection = url.openConnection();
// urlConnection.setRequestProperty("", ""); // set auth headers if necessary
return urlConnection.getInputStream();
}
@Bean
public JsonItemReader<Pojo> itemReader() throws IOException {
return new JsonItemReaderBuilder<Pojo>()
.name("restReader")
.resource(new InputStreamResource(urlResource()))
.strict(true)
.jsonObjectReader(new JacksonJsonObjectReader<>(Pojo.class))
.build();
}
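As a follow-up to tie this back to the job above (a sketch only: jsonApiReader() is a hypothetical bean built exactly like the itemReader() shown in this answer, but typed to WFitsVendor), the step in VendorJob could then read straight from the API stream instead of a file:
@Bean
public Step loadVendorTableFromApi() throws IOException {
    // Same shape as loadVendorTable(), just swapping the file-backed reader
    // for the stream-backed one.
    return stepBuilderFactory.get("load-wfitsvendor-table-from-api")
            .<WFitsVendor, WFitsVendor>chunk(10000)
            .reader(jsonApiReader())
            .processor(VendorProcessor())
            .writer(VendorWriter())
            .build();
}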
My project uses Spring Boot + Spring Data JPA + Shiro.
Because my server database uses a master/slave setup, my code needs to connect to two databases, so I designed it around AbstractRoutingDataSource plus an AOP aspect. Now I have a problem that I think may be caused by Shiro.
I know that connection switching is performed by the getConnection() method of AbstractRoutingDataSource, and I cannot control that method manually. The problem is that getConnection() is executed at most twice in a single request. Let me post my code and describe it:
@Order(0)
@Aspect
@Component
public class RoutingAopAspect {
@Around("@annotation(targetDataSource)")
public Object routingWithDataSource(ProceedingJoinPoint joinPoint, TargetDataSource targetDataSource) throws Throwable {
try {
DynamicRoutingDataSourceContext.setRoutingDataSource(targetDataSource.value());
return joinPoint.proceed();
} finally {
DynamicRoutingDataSourceContext.removeRoutingDataSource();
}
}
}
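The @TargetDataSource annotation that the aspect binds to is not shown in the question; a typical definition (an assumption here, not the asker's actual code) looks like this:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical marker annotation carrying the routing key ("master" or "slave").
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface TargetDataSource {
    String value() default DynamicRoutingDataSourceContext.MASTER;
}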
public class DynamicRoutingDataSourceContext {
public static final String MASTER = "master";
public static final String SLAVE = "slave";
private static final ThreadLocal<Object> threadLocalDataSource = new ThreadLocal<>();
public static void setRoutingDataSource(Object dataSource) {
if (dataSource == null) {
throw new NullPointerException();
}
threadLocalDataSource.set(dataSource);
// System.err.println(Thread.currentThread().getName()+" set RoutingDataSource : " + dataSource);
}
public static Object getRoutingDataSource() {
Object dataSourceType = threadLocalDataSource.get();
if (dataSourceType == null) {
threadLocalDataSource.set(DynamicRoutingDataSourceContext.MASTER);
return getRoutingDataSource();
}
// System.err.println(Thread.currentThread().getName()+" get RoutingDataSource : " + dataSourceType);
return dataSourceType;
}
public static void removeRoutingDataSource() {
threadLocalDataSource.remove();
// System.err.println(Thread.currentThread().getName()+" remove RoutingDataSource");
}
}
@EnableTransactionManagement
@Configuration
public class DataSourceConfig {
#Value("${datasource.master.url}")
private String masterUrl;
#Value("${datasource.master.username}")
private String masterUsername;
#Value("${datasource.master.password}")
private String masterPassword;
#Value("${dataSource.driverClass}")
private String masterDriverClassName;
#Value("${datasource.slave.url}")
private String slaveUrl;
#Value("${datasource.slave.username}")
private String slaveUsername;
#Value("${datasource.slave.password}")
private String slavePassword;
#Value("${dataSource.driverClass}")
private String slaveDriverClassName;
#Bean(name = "masterDataSource")
public DataSource masterDataSource(){
DruidDataSource datasource = new DruidDataSource();
datasource.setUrl(masterUrl);
datasource.setUsername(masterUsername);
datasource.setPassword(masterPassword);
datasource.setDriverClassName(masterDriverClassName);
return datasource;
}
#Bean(name = "slaveDataSource")
public DataSource slaveDataSource(){
DruidDataSource datasource = new DruidDataSource();
datasource.setUrl(slaveUrl);
datasource.setUsername(slaveUsername);
datasource.setPassword(slavePassword);
datasource.setDriverClassName(slaveDriverClassName);
return datasource;
}
@Primary
@Bean
public DynamicRoutingDataSource dynamicDataSource(@Qualifier(value = "masterDataSource") DataSource masterDataSource,
@Qualifier(value = "slaveDataSource") DataSource slaveDataSource) {
Map<Object, Object> targetDataSources = new HashMap<>(2);
targetDataSources.put(DynamicRoutingDataSourceContext.MASTER, masterDataSource);
targetDataSources.put(DynamicRoutingDataSourceContext.SLAVE, slaveDataSource);
DynamicRoutingDataSource dynamicRoutingDataSource = new DynamicRoutingDataSource();
dynamicRoutingDataSource.setTargetDataSources(targetDataSources);
dynamicRoutingDataSource.setDefaultTargetDataSource(masterDataSource);
dynamicRoutingDataSource.afterPropertiesSet();
return dynamicRoutingDataSource;
}
}
This is the relevant basic configuration of AbstractRoutingDataSource.
I defined an aspect that reads the parameter of @TargetDataSource on the method; that parameter names the data source the current method should run against. I think there is no problem with my configuration.
I then use @TargetDataSource on my service methods. I also use Shiro, and Shiro's doGetAuthorizationInfo() and doGetAuthenticationInfo() methods are executed before my service; both of them call my UserService.
The problem is that doGetAuthorizationInfo() and doGetAuthenticationInfo() each trigger the getConnection() method of AbstractRoutingDataSource to switch the data source, but when execution reaches my own service, getConnection() is not called again. That is what I meant by getConnection() being executed at most twice in a single request.
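For reference, DynamicRoutingDataSource itself is not shown above; the usual implementation (an assumption here) just returns the ThreadLocal key, and AbstractRoutingDataSource calls it from getConnection() to pick the target DataSource:
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class DynamicRoutingDataSource extends AbstractRoutingDataSource {
    // Called by AbstractRoutingDataSource.getConnection() to select the target DataSource.
    @Override
    protected Object determineCurrentLookupKey() {
        return DynamicRoutingDataSourceContext.getRoutingDataSource();
    }
}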
@Slf4j
@Component
public class ShiroRealm extends AuthorizingRealm {
@Autowired
@Lazy
private UserService userService;
@Autowired
CacheUtil cacheUtil;
@Override
public boolean supports(AuthenticationToken token) {
return token instanceof JwtToken;
}
@Override
protected AuthorizationInfo doGetAuthorizationInfo(PrincipalCollection principals) {
String username = JwtUtil.getClaim(principals.toString(), "username");
User user = userService.getUserByUsername(username);
SimpleAuthorizationInfo simpleAuthorizationInfo = new SimpleAuthorizationInfo();
simpleAuthorizationInfo.addRole(user.getRole());
return simpleAuthorizationInfo;
}
@Override
protected AuthenticationInfo doGetAuthenticationInfo(AuthenticationToken auth) {
String token = (String) auth.getCredentials();
String username = JwtUtil.getClaim(token, "username");
if (username == null) {
throw new AuthenticationException("token invalid");
}
User user = userService.getUserByUsername(username);
if (user == null) {
throw new AuthenticationException("User didn't existed!");
}
if (JwtUtil.verify(token, username, user.getPassword(), TokenType.ACCESS_TOKEN) &&
cacheUtil.hasKey(CacheKey.ACCESS_TOKEN_KEY + token)
) {
return new SimpleAuthenticationInfo(token, token, "userRealm");
}
throw new AuthenticationException("Token expired or incorrect");
}
}
@Service
public class PageServiceImpl implements PageService {
@Autowired
PageRepository pageRepository;
@Override
@TargetDataSource("slave")
@Transactional(rollbackFor = Exception.class)
public List<Page> adminFindAll() {
List<Page> pageList = pageRepository.findAll();
if (pageList.isEmpty()) {
throw new CustomNotFoundException("page list not found");
}
return pageList;
}
}
I don't know if my description is clear; if not, please ask. I hope to get your help, thank you very much!
I am using AWS Rekognition to build an application, and I have realized that every time I make a request to the service a connection to AWS is re-established, which is slowing down performance. Is there any way to make one single connection that persists throughout the session? My code can be seen below:
private static final AmazonRekognition rekognitionClient = RekognitionUtil.setupRekognitionClient();
private static AWSCredentialsProvider setupCredentials(String accessKey, String secretKey) {
AWSCredentialsProvider provider = new AWSCredentialsProvider() {
@Override
public AWSCredentials getCredentials() {
return new AWSCredentials() {
@Override
public String getAWSAccessKeyId() {
LOG.info("Access key: " + ConfigUtil.getString(ConfigConstants.CONFIG_REKOGNITION_ACCESS_KEY,accessKey));
return ConfigUtil.getString(ConfigConstants.CONFIG_REKOGNITION_ACCESS_KEY,accessKey);
}
@Override
public String getAWSSecretKey() {
LOG.info("Secret key: " + ConfigUtil.getString(ConfigConstants.CONFIG_REKOGNITION_SECRET_KEY,secretKey));
return ConfigUtil.getString(ConfigConstants.CONFIG_REKOGNITION_SECRET_KEY,secretKey);
}
};
}
@Override
public void refresh() {
}
};
return provider;
}
private static AmazonRekognition setupRekognitionClient() {
AWSCredentialsProvider provider = setupCredentials("xxxx", "xxxx");
return AmazonRekognitionClientBuilder.standard().withCredentials(provider).withRegion(ConfigUtil.getString(ConfigConstants.CONFIG_REKOGNITION_REGION,"xxx")).build();
}
public static String searchCollectionByFace(String collectionId, ByteBuffer sourceByteBuffer) throws Exception {
LOG.info("Searching face collection by face...");
String faceId = "";
try {
ObjectMapper objectMapper = new ObjectMapper();
// Get an image object from S3 bucket.
Image image = new Image().withBytes(sourceByteBuffer);
// Search collection for faces similar to the largest face in the image.
SearchFacesByImageRequest searchFacesByImageRequest = new SearchFacesByImageRequest().withCollectionId(collectionId).withImage(image).withFaceMatchThreshold(70F).withMaxFaces(2);
SearchFacesByImageResult searchFacesByImageResult = rekognitionClient.searchFacesByImage(searchFacesByImageRequest);
List<FaceMatch> faceImageMatches = searchFacesByImageResult.getFaceMatches();
for (FaceMatch face : faceImageMatches) {
LOG.info(face.getFace().getFaceId());
if(face.getFace().getConfidence() > SIMILARITY_LIMIT){
faceId = face.getFace().getFaceId();
}
}
return faceId;
} catch (Exception ex) {
LOG.error("Error has occurred searching for face", ex);
throw new Exception();
}
}
You can try to fine tune:
Max connections
Max connection idle time
in the client configuration you pass to the client.
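For reference, a minimal sketch of building the client once with a tuned ClientConfiguration in the AWS SDK for Java v1 (the pool size, idle time and region below are assumed values, not recommendations):
import com.amazonaws.ClientConfiguration;
import com.amazonaws.services.rekognition.AmazonRekognition;
import com.amazonaws.services.rekognition.AmazonRekognitionClientBuilder;

public final class RekognitionClientHolder {
    private RekognitionClientHolder() {}

    // Build the client once and reuse it; the SDK maintains an internal connection pool.
    private static final AmazonRekognition REKOGNITION = AmazonRekognitionClientBuilder.standard()
            .withClientConfiguration(new ClientConfiguration()
                    .withMaxConnections(50)              // assumed pool size
                    .withConnectionMaxIdleMillis(60_000) // keep idle connections around longer
                    .withTcpKeepAlive(true))
            .withRegion("us-east-1")                     // assumed region
            .build();

    public static AmazonRekognition get() {
        return REKOGNITION;
    }
}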
When my code tries to deserialize a ResponseEntity from Redis I get this error:
nested exception is com.fasterxml.jackson.databind.JsonMappingException: Can not construct instance of org.springframework.http.ResponseEntity: no suitable constructor found, can not deserialize from Object value (missing default constructor or creator, or perhaps need to add/enable type information?)
My Controller:
@Cacheable(value = "restapi", keyGenerator = "customKeyGenerator")
@RequestMapping(value = "/restapi/{id}", method = RequestMethod.GET, produces = { "application/json;charset=UTF-8" })
@ResponseStatus(HttpStatus.OK)
public ResponseEntity<Object> findRestApi(@PathVariable("id") Integer id) {
RestModel model = restService.findRestById(id);
return new ResponseEntity<>(model, HttpStatus.OK);
}
My Redis Configuration Class
@Configuration
@EnableCaching
public class CacheConfig extends CachingConfigurerSupport {
#Value("${spring.redis.host}")
private String host;
#Value("${spring.redis.port}")
private Integer port;
#Bean
public JedisConnectionFactory redisConnectionFactory() {
JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
redisConnectionFactory.setHostName(host);
redisConnectionFactory.setPort(port);
return redisConnectionFactory;
}
@Bean
RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory rcf) {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(rcf);
template.setKeySerializer(new StringRedisSerializer());
template.setValueSerializer(new JsonRedisSerializer());
return template;
}
static class JsonRedisSerializer implements RedisSerializer<Object> {
private final ObjectMapper om;
public JsonRedisSerializer() {
this.om = new ObjectMapper().enableDefaultTyping(DefaultTyping.NON_FINAL, As.PROPERTY);
}
@Override
public byte[] serialize(Object t) throws SerializationException {
try {
return om.writeValueAsBytes(t);
} catch (JsonProcessingException e) {
throw new SerializationException(e.getMessage(), e);
}
}
@Override
public Object deserialize(byte[] bytes) throws SerializationException {
if(bytes == null){
return null;
}
try {
return om.readValue(bytes, Object.class);
} catch (Exception e) {
throw new SerializationException(e.getMessage(), e);
}
}
}
@Bean
public CacheManager cacheManager(RedisTemplate redisTemplate) {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
cacheManager.setDefaultExpiration(300);
return cacheManager;
}
@Bean
public KeyGenerator customKeyGenerator() {
return new KeyGenerator() {
@Override
public Object generate(Object o, Method method, Object... objects) {
StringBuilder sb = new StringBuilder();
sb.append(o.getClass().getName());
sb.append(method.getName());
for (Object obj : objects) {
sb.append(obj.toString());
}
return sb.toString();
}
};
}
}
Does anyone know what I can do? Thanks!
AdminSOAPRunner:
@Component
public class AdminSOAPRunner {
private static final Logger LOGGER = LoggerFactory.getLogger(AdminSOAPRunner.class);
private String userId;
public String getUserId() {
return userId;
}
public void setUserId(String userId) {
this.userId = userId;
}
@Autowired
private AdminAuth adminAuthenticator;
@Autowired
private AdminBean adminBean;
private AccountService accountService;
private void setBindingProviderByAccountService() {
WSBindingProvider bindingProvider = (WSBindingProvider) this.accountService;
bindingProvider.getRequestContext().put(BindingProvider.ENDPOINT_ADDRESS_PROPERTY, adminBean.getAccountUrl());
LOGGER.info("Endpoint {}", adminBean.getAccountUrl());
}
private RequestInfo getRequestInfo() {
RequestInfo requestInfo = new RequestInfo();
requestInfo.setAppName(adminBean.getAppName());
requestInfo.setUserId(this.getUserId());
requestInfo.setTrace(UUID.randomUUID().toString());
return requestInfo;
}
public List<ApplyAccountResult> getAccounts(ApplyAccountRequest request) {
AccountService_Service service = null;
URL serviceWSDL = AccountService_Service.class.getResource("/Account-service/Account-service.wsdl");
service = new AccountService_Service(serviceWSDL);
SOAPHandlerResolver SOAPHandlerResolver = new SOAPHandlerResolver();
SOAPHandlerResolver.getHandlerList().add(new SOAPHandler(this.adminAuthenticator));
service.setHandlerResolver(SOAPHandlerResolver);
if (accountService == null) {
accountService = service.getAccountService();
}
setBindingProviderByAccountService();
ApplyAccountAccountResponse response = null;
LOGGER.info("Making a SOAP request.");
response = accountService.applyAccount(request, getRequestInfo(), new Holder<ResponseInfo>());
LOGGER.info("SOAP request completed.");
return response.getApplyAccountResults();
}
}
SOAPHandlerResolver:
public class SOAPHandlerResolver implements HandlerResolver {
@SuppressWarnings("rawtypes")
private List<Handler> handlerList;
public SOAPHandlerResolver() {
this.handlerList = null;
}
@SuppressWarnings("rawtypes")
public List<Handler> getHandlerList() {
if (this.handlerList == null) {
this.handlerList = new ArrayList<>();
}
return this.handlerList;
}
@SuppressWarnings("rawtypes")
@Override
public List<Handler> getHandlerChain(PortInfo portInfo) {
List<Handler> handlerChain = new ArrayList<>();
if (this.handlerList == null || this.handlerList.isEmpty()) {
this.handlerList = new ArrayList<>();
this.handlerList.add(new SOAPHandler(null));
}
handlerChain.addAll(this.handlerList);
return handlerChain;
}
}
SOAPHandler
public class SOAPHandler implements SOAPHandler<SOAPMessageContext> {
private AdminAuth adminAuth;
private static final Logger LOGGER = LoggerFactory.getLogger(SOAPHandler.class);
public SOAPHandler(AdminAuth adminAuth) {
if (adminAuth == null) {
adminAuth = new AdminAuth();
LOGGER.info("AdminAuth found null. Creating new adminAuth instance.");
}
this.adminAuth = adminAuth;
}
@Override
public boolean handleMessage(SOAPMessageContext context) {
Boolean outboundProperty = (Boolean) context.get(MessageContext.MESSAGE_OUTBOUND_PROPERTY);
if (outboundProperty) {
@SuppressWarnings("unchecked")
Map<String, List<String>> headers = (Map<String, List<String>>) context.get(MessageContext.HTTP_REQUEST_HEADERS);
if (headers == null) {
headers = new HashMap<>();
context.put(MessageContext.HTTP_REQUEST_HEADERS, headers);
}
List<String> cookie = headers.get("Cookie");
if (cookie == null) {
cookie = new ArrayList<>();
headers.put("Cookie", cookie);
}
cookie.add(this.adminAuth.getToken());
}
return true;
}
@Override
public boolean handleFault(SOAPMessageContext context) {
return false;
}
@Override
public void close(MessageContext context) {
}
@Override
public Set<QName> getHeaders() {
return null;
}
}
AdminAuth:
@Component
public class AdminAuth {
@Autowired
private AdminBean adminBean;
private static final Logger LOG = LoggerFactory.getLogger(AdminAuth.class);
private String token;
private void generateToken() {
try {
AdminTokenHelper adminTokenHelper = new AdminTokenHelper(adminBean.getAutheticationServerURL(), adminBean.getLicense());
token = adminTokenHelper.getToken(adminBean.getUsername(), adminBean.getPassword().toCharArray());
LOG.info("Token generation successful");
} catch (Exception ex) {
ex.printStackTrace();
LOG.error("Token generation failed");
LOG.error(ex.getMessage());
throw new RuntimeException("Token generation failed", ex);
}
}
@Cacheable(value = "tokenCache")
public String getToken() {
LOG.warn("Token not available. Generating a new token.");
generateToken();
return token;
}
}
ehcache.xml
<cache name="tokenCache" maxEntriesLocalHeap="1" eternal="false" timeToIdleSeconds="895" timeToLiveSeconds="895" memoryStoreEvictionPolicy="LRU"/>
Application
@EnableCaching
@SpringBootApplication
public class Application extends SpringBootServletInitializer {
public static void main(final String[] args) {
SpringApplication.run(Application.class, args);
}
@Override
protected SpringApplicationBuilder configure(final SpringApplicationBuilder application) {
return application.sources(Application.class).profiles(determineEnvironmentProfile());
}
}
In AdminAuth, a functional (service) user is used to generate the token. The token generated for authentication expires in 15 minutes, so my intent was to cache it so that every call coming from the UI reuses the same token regardless of the actual user, and I set the cache lifetime to 14:55 so a new token is generated before expiry. The problem is that after 15 minutes the cache does not evict the old token, so the next call uses the expired token and fails.
I tried different eviction policies like LRU, LFU and FIFO, but nothing works. The calls come from the UI through the Tomcat container on Spring Boot 1.3.
Why is the entry not getting evicted? What am I missing? Any help is appreciated.
Replace @Cacheable(value = "tokenCache") with @Cacheable("tokenCache")
From the comments:
The dependency on spring-boot-starter-cache was missing. This prevented Spring Boot from automatically configuring the CacheManager. Once this dependency was added, the cache configuration worked.
See http://docs.spring.io/spring-boot/docs/1.3.x/reference/html/boot-features-caching.html
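For completeness, the starter mentioned in the comments is pulled in with the following Maven dependency (the version is managed by the Spring Boot parent):
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-cache</artifactId>
</dependency>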