Hello,
I am trying to create a WebSocket endpoint using Spring WebFlux. I want this endpoint to return some events.
In order to do so I created a ConnectableFlux of events, and in the handle(..) method I map it to a Flux of WebSocketMessages. But after I give it to the WebSocketSession, nothing happens - the WebSocket client doesn't receive anything. At the same time, the println(event.toString()) call you can see in my handle(..) method below does print information to the console.
Could you please tell me what I am missing?
public class EventWebsocketHandler implements WebSocketHandler {
// constructors and etc.
@Override
public Mono<Void> handle(WebSocketSession session) {
ObjectMapper objectMapper = new ObjectMapper();
Flux<WebSocketMessage> messages = eventService.events()
.flatMap(event -> {
try {
System.out.println(event.toString());
return Mono.just(objectMapper.writeValueAsString(event));
} catch (JsonProcessingException e) {
return Mono.error(e);
}
})
.map(session::textMessage);
return session.send(messages);
}
}
@Service
public class EventService {
List<EventDto> events = new ArrayList<>();
private final Flux<EventDto> eventFlux = Flux.<EventDto>create(fluxSink -> {
while (true) {
if (!events.isEmpty()) {
fluxSink.next(events.get(0));
events.remove(0);
}
}
})
.publish()
.autoConnect();
public void push(EventDto event) {
events.add(event);
}
public Flux<EventDto> events() {
return eventFlux;
}
}
I have another WebSocketHandler in my project and it works fine, which means everything is OK with the configuration:
public class MyWebSocketHandler implements WebSocketHandler {
@Override
public Mono<Void> handle(WebSocketSession session) {
Flux<Long> source = Flux.interval(Duration.ofMillis(1000 * 3));
return session.send(source.map(l -> session.textMessage(String.valueOf(l))));
}
}
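(For reference, handlers like these are registered through a HandlerMapping plus a WebSocketHandlerAdapter; the sketch below shows what such a configuration typically looks like. The "/events" path and the configuration class name are placeholders, the real configuration is not shown here, and it assumes the handler is itself a bean.)
@Configuration
public class WebSocketConfiguration {

    // Maps URL paths to WebSocketHandler beans; "/events" is only a placeholder path.
    @Bean
    public HandlerMapping webSocketHandlerMapping(EventWebsocketHandler eventWebsocketHandler) {
        Map<String, WebSocketHandler> map = new HashMap<>();
        map.put("/events", eventWebsocketHandler);
        SimpleUrlHandlerMapping mapping = new SimpleUrlHandlerMapping();
        mapping.setUrlMap(map);
        mapping.setOrder(-1); // before annotated controllers
        return mapping;
    }

    // Lets WebFlux dispatch requests to WebSocketHandler instances.
    @Bean
    public WebSocketHandlerAdapter handlerAdapter() {
        return new WebSocketHandlerAdapter();
    }
}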
This
private final Flux<EventDto> eventFlux = Flux.<EventDto>create(fluxSink -> {
while (true) {
if (!events.isEmpty()) {
fluxSink.next(events.get(0));
events.remove(0);
}
}
})
.publish()
.autoConnect();
has to be replaced with this:
private final Sinks.Many<EventDto> processor = Sinks.many().multicast().onBackpressureBuffer();
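With the sink in place, a minimal sketch of the whole EventService could look like this (tryEmitNext replaces the busy-wait loop; in production code its EmitResult should be checked rather than ignored):
@Service
public class EventService {

    private final Sinks.Many<EventDto> sink = Sinks.many().multicast().onBackpressureBuffer();

    public void push(EventDto event) {
        // Emits on the caller's thread; tryEmitNext returns an EmitResult that can be checked or retried.
        sink.tryEmitNext(event);
    }

    public Flux<EventDto> events() {
        return sink.asFlux();
    }
}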
Related
I have a web application that uses Kafka and Spring WebSocket to emit different events to a mobile app. Events are triggered based on whether the mobile app is online or offline.
The logic works perfectly fine if an event is triggered while the mobile app is already online. But if it's offline, the triggered event is stored in a Postgres table and marked with status PENDING. When the mobile app comes back online, the table is searched for pending messages and the event is then triggered.
In order to update the mobile status, I am using socket sessions and a ChannelInterceptor which updates the mobile app status on the SUBSCRIBE and DISCONNECT events.
I have logs at every possible place, and all of them are executed when the mobile app comes back online. There's no error anywhere. The last log is printed right before:
messagingTemplate.convertAndSendToUser();
Logs :
Device Status in request : true
Send Mode : IMMEDIATE
inside immediate never expire >>>>>
Enter: SocketService.send() with argument[s] = [{Status=200, Response={PayloadId=1415, PayloadExpiryDateTime=0, PayloadPriority=1, EventType=NEW_MESSAGE, TemporaryMessage={Messages={Data=[{Title=Message1, DataToSend={Message=Message1, Data=bZijp8pcSKwnbRhnm2khEUkaonv3XpZ6PnS0IRP+ab4p1ub2Cmuuylga6z8RASwPWbTbxbZH5ickk4HcWKeEM1Qq8PHnmAEj/VmnLmhP9UbITDrFbzWaeUcAhdWuFzD1QKfp4lmjsFl6LeswN6x+tgFi+mimURK9EaKuKiQKnP3sZHfw6Bk2o/jH4ik8hoKetO2GRxNOKs/8H9NYUS4hLP/RP3vJttUtHo3fiJd3WwWcxT+w+YZneDFmc28ECjpoJ0ZzgV7NRJlhKBcOMl7V6eV2Rax7yNC2k7zyHH/ylkhkq74heWAtde/S43S6WLTMbtc2+t1MgHNOevsxEyI/ghBE1wJzgBIdvlhFX4M1jam+AYHyXhF7CNC/Hr/i+dpw3/SogNQwIidW8FZ5EWwBJP2nYv0eZ0owK/8zwEs3FZWF9fRUeUGn7VwFNFDsf4P1FUbeWcwHA847iHBuUF2qDVtLNgQtyejGKxhn3+a8yoKyU96cpteOmGmbZPuyU3xRk401gkk4a0noS3SOjWO8B0OKyq782K3gRUoVssPV8Hg=, ReqNr=9.0000001E7}}]}, Playlists={Data=[{Title=TemporaryPlaylist, DataToSend={Message=Temporary Playlist, Data=G7Prf2ZRUh+MSBNTjLH2BT7nNB1o8zXgz/mn9leZ7Q4UoZH1MzMrJj3ISvdkQZNAbVCnYQjh5XhOnUr1kGrZpWJ/lHUyJi1coc7fdMxe0fkgBJk9SySvaFN/TvFQaUlv2wRWIoPVXmLVhukFgzLsDeVWmuzNaWZXTQCemIzrK6SaCJ+FfFAzgxG2Z7QxWkVx, ReqNr=9.0000002E7}}]}, ShowCommands={Data=[{DataToSend={Type=ShowPlaylistRequest, ShowAll=[{CallWord=t2212161322}]}}]}}}, Message=Success}, message, 108]
Sending to Producer : {"Status":200,"Response":{"PayloadId":1415,"PayloadExpiryDateTime":0,"PayloadPriority":1,"EventType":"NEW_MESSAGE","TemporaryMessage":{"Messages":{"Data":[{"Title":"Message1","DataToSend":{"Message":"Message1","Data":"bZijp8pcSKwnbRhnm2khEUkaonv3XpZ6PnS0IRP+ab4p1ub2Cmuuylga6z8RASwPWbTbxbZH5ickk4HcWKeEM1Qq8PHnmAEj/VmnLmhP9UbITDrFbzWaeUcAhdWuFzD1QKfp4lmjsFl6LeswN6x+tgFi+mimURK9EaKuKiQKnP3sZHfw6Bk2o/jH4ik8hoKetO2GRxNOKs/8H9NYUS4hLP/RP3vJttUtHo3fiJd3WwWcxT+w+YZneDFmc28ECjpoJ0ZzgV7NRJlhKBcOMl7V6eV2Rax7yNC2k7zyHH/ylkhkq74heWAtde/S43S6WLTMbtc2+t1MgHNOevsxEyI/ghBE1wJzgBIdvlhFX4M1jam+AYHyXhF7CNC/Hr/i+dpw3/SogNQwIidW8FZ5EWwBJP2nYv0eZ0owK/8zwEs3FZWF9fRUeUGn7VwFNFDsf4P1FUbeWcwHA847iHBuUF2qDVtLNgQtyejGKxhn3+a8yoKyU96cpteOmGmbZPuyU3xRk401gkk4a0noS3SOjWO8B0OKyq782K3gRUoVssPV8Hg=","ReqNr":9.0000001E7}}]},"Playlists":{"Data":[{"Title":"TemporaryPlaylist","DataToSend":{"Message":"Temporary Playlist","Data":"G7Prf2ZRUh+MSBNTjLH2BT7nNB1o8zXgz/mn9leZ7Q4UoZH1MzMrJj3ISvdkQZNAbVCnYQjh5XhOnUr1kGrZpWJ/lHUyJi1coc7fdMxe0fkgBJk9SySvaFN/TvFQaUlv2wRWIoPVXmLVhukFgzLsDeVWmuzNaWZXTQCemIzrK6SaCJ+FfFAzgxG2Z7QxWkVx","ReqNr":9.0000002E7}}]},"ShowCommands":{"Data":[{"DataToSend":{"Type":"ShowPlaylistRequest","ShowAll":[{"CallWord":"t2212161322"}]}}]}}},"Message":"Success"}
Exit: SocketService.send() with result = null
Received from Kafka : {"Status":200,"Response":{"PayloadId":1415,"PayloadExpiryDateTime":0,"PayloadPriority":1,"EventType":"NEW_MESSAGE","TemporaryMessage":{"Messages":{"Data":[{"Title":"Message1","DataToSend":{"Message":"Message1","Data":"bZijp8pcSKwnbRhnm2khEUkaonv3XpZ6PnS0IRP+ab4p1ub2Cmuuylga6z8RASwPWbTbxbZH5ickk4HcWKeEM1Qq8PHnmAEj/VmnLmhP9UbITDrFbzWaeUcAhdWuFzD1QKfp4lmjsFl6LeswN6x+tgFi+mimURK9EaKuKiQKnP3sZHfw6Bk2o/jH4ik8hoKetO2GRxNOKs/8H9NYUS4hLP/RP3vJttUtHo3fiJd3WwWcxT+w+YZneDFmc28ECjpoJ0ZzgV7NRJlhKBcOMl7V6eV2Rax7yNC2k7zyHH/ylkhkq74heWAtde/S43S6WLTMbtc2+t1MgHNOevsxEyI/ghBE1wJzgBIdvlhFX4M1jam+AYHyXhF7CNC/Hr/i+dpw3/SogNQwIidW8FZ5EWwBJP2nYv0eZ0owK/8zwEs3FZWF9fRUeUGn7VwFNFDsf4P1FUbeWcwHA847iHBuUF2qDVtLNgQtyejGKxhn3+a8yoKyU96cpteOmGmbZPuyU3xRk401gkk4a0noS3SOjWO8B0OKyq782K3gRUoVssPV8Hg=","ReqNr":9.0000001E7}}]},"Playlists":{"Data":[{"Title":"TemporaryPlaylist","DataToSend":{"Message":"Temporary Playlist","Data":"G7Prf2ZRUh+MSBNTjLH2BT7nNB1o8zXgz/mn9leZ7Q4UoZH1MzMrJj3ISvdkQZNAbVCnYQjh5XhOnUr1kGrZpWJ/lHUyJi1coc7fdMxe0fkgBJk9SySvaFN/TvFQaUlv2wRWIoPVXmLVhukFgzLsDeVWmuzNaWZXTQCemIzrK6SaCJ+FfFAzgxG2Z7QxWkVx","ReqNr":9.0000002E7}}]},"ShowCommands":{"Data":[{"DataToSend":{"Type":"ShowPlaylistRequest","ShowAll":[{"CallWord":"t2212161322"}]}}]}}},"Message":"Success"}
Sending to Socket Client.......
But the front-end never gets the message.
Below are some code snippets:
Channel Interceptor:
public class SocketInterceptor implements ChannelInterceptor {
@Autowired
SocketSessionService socketSessionService;
private final Logger log = LoggerFactory.getLogger(SocketInterceptor.class);
@Override
public Message<?> preSend(Message<?> message, MessageChannel channel) {
MessageHeaders headers = message.getHeaders();
StompHeaderAccessor accessor = StompHeaderAccessor.wrap(message);
MultiValueMap<String, String> multiValueMap = headers.get(StompHeaderAccessor.NATIVE_HEADERS,MultiValueMap.class);
if (multiValueMap !=null) {
if (multiValueMap.containsKey("destination")) {
String destination = multiValueMap.get("destination").get(0);
log.info("Destination in interceptor : " + destination);
}
}
if (accessor.getCommand()!=null) {
log.info("Client Method : " + accessor.getCommand().name());
if (accessor.getCommand().equals(StompCommand.CONNECT) || accessor.getCommand().equals(StompCommand.SUBSCRIBE) || accessor.getCommand().equals(StompCommand.SEND)) {
if (!multiValueMap.containsKey("access-token")) {
log.error("Token Error " + "No Token was provided in + " + accessor.getCommand().name());
throw new MessagingException("Forbidden: No Token was provided in header for Socket Command: " + accessor.getCommand().name());
}
String token = multiValueMap.get("access-token").get(0);
if (JwtTokenUtil.verifyToken(JwtTokenUtil.decrypt(token))) {
log.info("Token Verified Successfully >>>>>>>>>>>>>>>>>>>> ");
Claims claims = JwtTokenUtil.verifyJwtToken(JwtTokenUtil.decrypt(token));
accessor.setHeader("access-token", token);
Principal principal = new UsernamePasswordAuthenticationToken(claims.getId(), null, null);
accessor.setUser(principal);
UsernamePasswordAuthenticationToken usernamePasswordAuthenticationToken = new UsernamePasswordAuthenticationToken(
claims.getId(), null, null);
SecurityContextHolder.getContext().setAuthentication(usernamePasswordAuthenticationToken);
if (accessor.getCommand().equals(StompCommand.SUBSCRIBE)){
log.info("Storing Socket Session on Socket Command SUBSCRIBE ");
SocketSessionModel model=new SocketSessionModel();
model.setSessionId(accessor.getSessionId());
model.setDeviceId(multiValueMap.get("destination").get(0));
model.setAccessToken(token);
model.setCurrentStatus(SocketSessionStatus.ONLINE);
model.setReportedOnlineAt(ZonedDateTime.now().with(ZoneOffset.UTC));
model.setReportedOfflineAt(null);
socketSessionService.onSubscribe(model);
}
ObjectMapper objectMapper= new ObjectMapper();
try {
log.info("Returning back the frame : " + objectMapper.writeValueAsString(message.getPayload()));
} catch (Exception e){
e.printStackTrace();
}
return new GenericMessage<>(message.getPayload(), accessor.getMessageHeaders());
}
throw new MessagingException("Token either Expired or Invalid");
}
}
return message;
}
}
WebSocket Config :
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {
Logger log= LoggerFactory.getLogger(WebSocketConfig.class);
@Autowired
SocketSessionService socketSessionService;
@Override
public void registerStompEndpoints(StompEndpointRegistry stompEndpointRegistry) {
stompEndpointRegistry.addEndpoint("/ws")
.setAllowedOrigins("*")
.withSockJS();
}
@Override
public void configureMessageBroker(MessageBrokerRegistry config) {
config.enableSimpleBroker("/user")
.setTaskScheduler(customTaskScheduler())
.setHeartbeatValue(new long[]{25000,10000});
config.setApplicationDestinationPrefixes("/app");
config.setUserDestinationPrefix("/user");
}
@Override
public void configureClientInboundChannel(ChannelRegistration registration) {
registration.interceptors(socketInterceptor());
}
@Bean
public SocketInterceptor socketInterceptor() {
return new SocketInterceptor();
}
@Bean
public TaskScheduler customTaskScheduler(){
return new ThreadPoolTaskScheduler();
}
@EventListener(SessionDisconnectEvent.class)
public void handleWsDisconnectListener(SessionDisconnectEvent event) {
Optional<String> sessionId= getSessionIdFromEvent(event);
sessionId.ifPresent(s -> socketSessionService.onDisconnect(s));
}
private Optional<String> getSessionIdFromEvent(AbstractSubProtocolEvent event) {
String sessionId = null;
System.out.println("Session Id in event : " + (event.getMessage().getHeaders()).get("simpSessionId"));
Object sessionIdAsObject = (event.getMessage().getHeaders()).get("simpSessionId");
if (nonNull(sessionIdAsObject) && sessionIdAsObject.getClass().equals(String.class)) {
sessionId = (String) sessionIdAsObject;
}
return Optional.ofNullable(StringUtils.isEmpty(sessionId) ? null : sessionId);
}
}
Kafka Producer Config :
@Service
public class KafkaProducerConfig {
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;
@Async
public void send(String data, String kafkaTopic, String key) {
kafkaTemplate.send(kafkaTopic, key, data);
}
@Async
public void send(String topic, String data ){
kafkaTemplate.send(topic, data);
}
}
KafkaConsumer Config :
@EnableKafka
@Configuration
public class KafkaConsumerConfig {
@Value("${spring.kafka.bootstrap-servers}")
private String bootstrapServers;
public ConsumerFactory<String, String> consumerFactory(String groupId) {
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
return new DefaultKafkaConsumerFactory<>(props);
}
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> rawKafkaListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory("something"));
return factory;
}
}
Kafka Listener :
This is the last step, where the socket emits the messages to the clients. The last log line mentioned above is executed.
@Component
public class KafkaListenerForSocket {
private final Logger log= LoggerFactory.getLogger(KafkaListenerForSocket.class);
@Autowired
SimpMessagingTemplate messagingTemplate;
@KafkaListener(topics = KafkaTopicConstants.MESSAGE, containerFactory = "rawKafkaListenerContainerFactory")
public void listenToMessages(ConsumerRecord<String, String> consumerRecord) throws IOException {
ObjectMapper objectMapper=new ObjectMapper();
String destination="/queue/message";
BaseSocketResponse response=objectMapper.readValue(consumerRecord.value(), BaseSocketResponse.class);
messagingTemplate.convertAndSendToUser(consumerRecord.key(),destination,response);
}
public void send(Object data, String key){
log.info("Bypassing Kafka and sending directly");
String destination="/queue/message";
log.info("sending to destination : " + "/user/"+key+destination);
messagingTemplate.convertAndSendToUser(key, destination, data);
}
}
BaseSocketResponse.java class :
public class BaseSocketResponse<T> {
#JsonProperty(value = "Status", required = true)
private Integer status;
#JsonProperty(value = "Message", required = true)
private String statusMessage = "";
#JsonProperty(value = "Response", required = true)
private T response;
public Integer getStatus() {
return status;
}
public void setStatus(Integer status) {
this.status = status;
}
public String getStatusMessage() {
return statusMessage;
}
public void setStatusMessage(String statusMessage) {
this.statusMessage = statusMessage;
}
public T getResponse() {
return response;
}
public void setResponse(T response) {
this.response = response;
}
}
front-end code :
var stompClient = null;
function setConnected(connected) {
$("#connect").prop("disabled", connected);
$("#disconnect").prop("disabled", !connected);
if (connected) {
$("#conversation").show();
}
else {
$("#conversation").hide();
}
$("#userinfo").html("");
}
function connect() {
let header={"access-token" : $("#token").val()};
var socket = new SockJS('/services/toplight/ws');
stompClient = Stomp.over(socket);
stompClient.connect(header, function (frame) {
if(frame.command == "CONNECTED") {
console.log("Inside frame command : >>>>> " + frame);
setConnected(true);
subscribe(stompClient)
}
}, (error) => {
console.log("Inside Connect error : " + error );
onConnectError(error);
});
}
function subscribe(stompClient){
var token=$("#token").val();
var header={"access-token": token};
console.log("Subscribing to Destination : " + '/user/'+$("#name").val()+'/queue/message');
stompClient.subscribe('/user/'+$("#name").val()+'/queue/message' , greeting, header);
var greeting=function(message){
var body=JSON.parse(message.body);
if(body.Response.EventType=="NEW_MESSAGE"){
showNewMessage(response);
}
if(body.Response.EventType=="DEVICE_UPDATE_RESPONSE"){
showDeviceUpdate(response);
}
if(body.Response.EventType=="MESSAGE_UPDATE_RESPONSE"){
showMessageUpdate(response);
}
};
}
function onConnectError(error){
console.log("Error : >>>>> " + error);
$("#greetings").append("<h3><tr><td>" + error + "</h3>");
}
I have a problem with @DeleteMapping.
The situation is as follows.
If I request /v1/cache/{cacheEntry} with method DELETE,
it responds with 404, but the body is empty: no message, no Spring default JSON 404 response.
If I request /v1/cache/{cacheEntry} with method POST,
it responds with 405 and the body shown below. (This behaviour is correct, not a bug.)
If I change @DeleteMapping to @PostMapping and request /v1/cache/{cacheEntry} with method POST, it responds successfully with code 200.
{
"timestamp": 1643348039913,
"status": 405,
"error": "Method Not Allowed",
"message": "",
"path": "/v1/cache/{cacheEntry}"
}
// Controller
@Slf4j
@RestController
@RequestMapping("/v1/cache")
@RequiredArgsConstructor
public class CacheController {
private final CacheService cacheService;
@PostMapping("/{cacheEntry}")
public CacheClearResponse clearCacheEntry(@PathVariable("cacheEntry") CacheChannels cacheEntry) {
try {
log.info("Cache entry :: " + cacheEntry);
cacheService.evictCacheEntry(cacheEntry);
return CacheClearResponse.builder()
.result(
RequestResult.builder()
.code(9200)
.message("SUCCESS")
.build()
)
.common(
Common.builder().build()
)
.date(LocalDateTime.now())
.build();
} catch (Exception e) {
e.printStackTrace();
StringWriter sw = new StringWriter();
e.printStackTrace(new PrintWriter(sw));
return CacheClearResponse.builder()
.result(
RequestResult.builder()
.code(9999)
.message(sw.toString())
.build()
)
.common(
Common.builder().build()
)
.date(LocalDateTime.now())
.build();
}
}
}
// CacheService
@Service
@RequiredArgsConstructor
public class CacheService {
private final CacheManager cacheManager;
public void evictCacheEntry(CacheChannels cacheEntry) {
Cache cache = cacheManager.getCache(cacheEntry.getCacheName());
if (cache != null) {
cache.clear();
}
}
public void evictCache(CacheChannels cacheEntry, String cacheKey) {
Cache cache = cacheManager.getCache(cacheEntry.getCacheName());
if (cache != null) {
cache.evict(cacheKey);
}
}
}
// Enum
@Getter
@AllArgsConstructor
public enum CacheChannels {
CACHE_TEN_MIN(Names.CACHE_TEN_MIN, Duration.ofMinutes(10)),
CACHE_HALF_HR(Names.CACHE_HALF_HR, Duration.ofMinutes(30)),
CACHE_ONE_HR(Names.CACHE_ONE_HR, Duration.ofHours(1)),
CACHE_THREE_HR(Names.CACHE_THREE_HR, Duration.ofHours(3)),
CACHE_SIX_HR(Names.CACHE_SIX_HR, Duration.ofHours(6)),
CACHE_ONE_DAY(Names.CACHE_ONE_DAY, Duration.ofDays(1));
private final String cacheName;
private final Duration cacheTTL;
public static CacheChannels from(String value) {
return Arrays.stream(values())
.filter(cacheChannel -> cacheChannel.cacheName.equalsIgnoreCase(value))
.findAny()
.orElse(null);
}
public static class Names {
public static final String CACHE_TEN_MIN = "cache10Minutes";
public static final String CACHE_HALF_HR = "cache30Minutes";
public static final String CACHE_ONE_HR = "cache1Hour";
public static final String CACHE_THREE_HR = "cache3Hours";
public static final String CACHE_SIX_HR = "cache6Hours";
public static final String CACHE_ONE_DAY = "cache1Day";
}
}
// Converter
@Slf4j
public class StringToCacheChannelConverter implements Converter<String, CacheChannels> {
@Override
public CacheChannels convert(String source) {
log.info("Convert Target: " + source);
return CacheChannels.from(source);
}
}
// Security Config
@Configuration
@EnableWebSecurity
@Order(1)
public class APISecurityConfig extends WebSecurityConfigurerAdapter {
#Value("${spring.security.auth-token-header-name:Authorization}")
private String apiKeyHeader;
#Value("${spring.security.secret}")
private String privateApiKey;
@Override
protected void configure(HttpSecurity http) throws Exception {
APIKeyAuthFilter filter = new APIKeyAuthFilter(apiKeyHeader);
filter.setAuthenticationManager(new AuthenticationManager() {
@Override
public Authentication authenticate(Authentication authentication)
throws AuthenticationException {
String requestedApiKey = (String) authentication.getPrincipal();
if (!privateApiKey.equals(requestedApiKey)) {
throw new BadCredentialsException("The API Key was not found or not the expected value");
}
authentication.setAuthenticated(true);
return authentication;
}
});
http
.csrf().disable()
.sessionManagement()
.sessionCreationPolicy(SessionCreationPolicy.STATELESS)
.and()
.addFilter(filter)
.authorizeRequests()
.antMatchers("/v1/cache/**")
.authenticated();
}
}
// Filter
@Slf4j
public class APIKeyAuthFilter extends AbstractPreAuthenticatedProcessingFilter {
private String apiKeyHeader;
public APIKeyAuthFilter(String apiKeyHeader) {
this.apiKeyHeader = apiKeyHeader;
}
@Override
protected Object getPreAuthenticatedPrincipal(HttpServletRequest httpServletRequest) {
log.info("Check authenticated.");
return httpServletRequest.getHeader(apiKeyHeader);
}
@Override
protected Object getPreAuthenticatedCredentials(HttpServletRequest httpServletRequest) {
return "N/A";
}
}
// Web Config
@Configuration
public class WebConfig implements WebMvcConfigurer {
@Override
public void addFormatters(FormatterRegistry registry) {
registry.addConverter(new StringToCacheChannelConverter());
}
@Bean
public HiddenHttpMethodFilter hiddenHttpMethodFilter() {
return new HiddenHttpMethodFilter();
}
}
From this it can be assumed that the controller was loaded and the endpoint was mapped.
I tried changing @DeleteMapping to @PostMapping and it responded successfully to the POST request.
What am I missing?
I found the reason why I received a 404 without any message.
My Tomcat is on a remote server. It is configured with a security-constraint that disables the DELETE method for all endpoints.
I just commented it out and it now works properly with the DELETE method.
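For reference, the kind of web.xml constraint that produces this behaviour looks roughly like the snippet below; an empty <auth-constraint/> denies the listed methods to everyone (the exact constraint on the server may differ):
<security-constraint>
    <web-resource-collection>
        <web-resource-name>restricted-methods</web-resource-name>
        <url-pattern>/*</url-pattern>
        <http-method>DELETE</http-method>
    </web-resource-collection>
    <!-- empty auth-constraint: no role is allowed to use the listed methods -->
    <auth-constraint/>
</security-constraint>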
I am fairly new to Project Reactor and WebFlux. I want to rewrite my current traditional, blocking security filter as a reactive one.
The current filter looks like this:
@Component
@Slf4j
public class WhitelistingFilter extends OncePerRequestFilter {
private static final String SECURITY_PROPERTIES = "security.properties";
private final Properties securityProperties = readConfigurationFile(SECURITY_PROPERTIES);
private final String whitelistingEnabled = securityProperties.getProperty("whitelisting.enabled", FALSE.toString());
private final RedisTemplate<String, Object> whitelistingRedisTemplate;
private final AwsCognitoIdTokenProcessor awsCognitoIdTokenProcessor;
public WhitelistingFilter(
#Qualifier("whitelistingRedisTemplate")
RedisTemplate<String, Object> whitelistingRedisTemplate,
AwsCognitoIdTokenProcessor awsCognitoIdTokenProcessor) {
this.whitelistingRedisTemplate = whitelistingRedisTemplate;
this.awsCognitoIdTokenProcessor = awsCognitoIdTokenProcessor;
}
@Override
protected boolean shouldNotFilter(@NonNull HttpServletRequest request) {
AntPathMatcher pathMatcher = new AntPathMatcher();
return Stream.of(USER_LOGIN_URL, ADMIN_LOGIN_URL, SIGNUP_BY_ADMIN_URL, SIGNUP_URL, LOGOUT_URL)
.anyMatch(p -> pathMatcher.match(p, request.getServletPath())) || whitelistingDisabled();
}
private boolean whitelistingDisabled() {
return FALSE.toString().equalsIgnoreCase(whitelistingEnabled);
}
@Override
protected void doFilterInternal(@NonNull HttpServletRequest httpServletRequest, @NonNull HttpServletResponse httpServletResponse, @NonNull FilterChain filterChain) {
try {
Authentication authentication = awsCognitoIdTokenProcessor.getAuthentication(httpServletRequest);
Optional<String> username = Optional.ofNullable(authentication.getName());
if (username.isPresent() && usernameWhitelisted(username.get())) {
log.info("User with username: {} is present in whitelisting", username.get());
filterChain.doFilter(httpServletRequest, httpServletResponse);
} else {
httpServletResponse.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
log.error("Username: {} not whitelisted or empty", username.orElse(""));
}
} catch (Exception e) {
logger.error("Error occurred while checking user in redis whitelisting", e);
SecurityContextHolder.clearContext();
}
}
private boolean usernameWhitelisted(String username) {
return Boolean.TRUE.equals(whitelistingRedisTemplate.hasKey(WHITELISTING_PREFIX + username));
}
}
The new, incomplete, reactive approach looks like this:
@Component
@Slf4j
public class WhitelistingFilter implements WebFilter {
private static final String SECURITY_PROPERTIES = "security.properties";
public final List<String> whitelistedUrls =
List.of(USER_LOGIN_URL, ADMIN_LOGIN_URL, SIGNUP_BY_ADMIN_URL, SIGNUP_URL, LOGOUT_URL);
private final Properties securityProperties = readConfigurationFile(SECURITY_PROPERTIES);
private final String whitelistingEnabled = securityProperties.getProperty("whitelisting.enabled", FALSE.toString());
private final ReactiveRedisOperations<String, Object> whitelistingRedisTemplate;
private final AuthenticationManager authenticationManager;
public WhitelistingFilter(
#Qualifier("reactiveWhitelistingRedisTemplate")
ReactiveRedisOperations<String, Object> whitelistingRedisTemplate,
AuthenticationManager authenticationManager) {
this.whitelistingRedisTemplate = whitelistingRedisTemplate;
this.authenticationManager = authenticationManager;
}
@Override
public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
Mono<String> username =
ReactiveSecurityContextHolder.getContext()
.map(SecurityContext::getAuthentication)
.map(Authentication::getName);
//logic here
}
private Mono<Boolean> whitelistingDisabled() {
return Mono.just(FALSE.toString().equalsIgnoreCase(whitelistingEnabled));
}
private Mono<Boolean> usernameWhitelisted(Mono<String> username) {
return whitelistingRedisTemplate.hasKey(WHITELISTING_PREFIX + username);
}
}
I changed the usernameWhitelisted() and whitelistingDisabled() methods to return Monos, but I cannot figure out how to verify whether the username is whitelisted and whitelisting is enabled in the reactive approach. I tried something like
username.flatMap(u -> {
if(two conditions here)
})
but with this approach I am providing a Mono to the if statement, which is at odds with Java semantics. I would be grateful for suggestions on how to rewrite the code and make it work in a reactive way.
Having a reactive stream or pipe doesn't mean that everything needs to be reactive; you can simply keep the Mono out of the boolean operators.
Since you return a Mono<Void> and (presumably) need logging, I guess you could just throw a custom exception here as well, catch it at the top of the stream/pipe using doOnError, check the exception type there and apply the corresponding logic (pass a message to the exception should you need one):
ReactiveSecurityContextHolder.getContext()
.map(SecurityContext::getAuthentication)
.map(Authentication::getName)
.doOnNext(this::doBla)
public void doBla(String username) {
if (!whitelistingDisabled() || !usernameWhitelisted(username)) {
throw new TypeException(ms);
}
}
FYI, doOnNext is only executed if you reach that part; it's kind of like peek in a Java stream.
The doOnError logic could also be extracted; a stream is easier to read if you keep it simple and don't put too much into it:
myStream
....otherOperators
.filter(bla)
...
.doOnError(e -> {
if (e instanceof TypeException) {
// do stuff
}
});
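Putting that suggestion into the filter() method, a rough sketch could look like the following. It assumes whitelistingDisabled() is kept as the plain boolean check from the blocking filter (as suggested above), WhitelistingException is a hypothetical custom exception, and the URL bypass and 401 handling mirror the blocking version:
@Override
public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
    String path = exchange.getRequest().getPath().value();
    AntPathMatcher pathMatcher = new AntPathMatcher();
    // Same bypass as shouldNotFilter() in the blocking version.
    if (whitelistingDisabled() || whitelistedUrls.stream().anyMatch(p -> pathMatcher.match(p, path))) {
        return chain.filter(exchange);
    }
    return ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .map(Authentication::getName)
            // hasKey(..) returns Mono<Boolean>; filterWhen drops the username when it is not whitelisted.
            .filterWhen(name -> whitelistingRedisTemplate.hasKey(WHITELISTING_PREFIX + name))
            .switchIfEmpty(Mono.error(new WhitelistingException("Username not whitelisted or empty")))
            .flatMap(name -> chain.filter(exchange))
            .onErrorResume(WhitelistingException.class, e -> {
                log.error(e.getMessage());
                exchange.getResponse().setStatusCode(HttpStatus.UNAUTHORIZED);
                return exchange.getResponse().setComplete();
            });
}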
I'm making an AJAX call to a method that returns a list of objects. If something happens while getting the data, I call response.setStatus(400) in a try-catch block so the error can be shown in the front-end, and I also return null there, which is where I'm getting the SonarLint notification. Now, if I change that to return an empty collection, I get the error below:
getWriter() has already been called for this response
I think the above happens because I'm returning both the empty collection and the HTTP 400 status. If I leave it as null then everything works fine, apart from the SonarLint notification.
@GetMapping("/runquery")
@ResponseBody
public List<Map<String, Object>> runQuery(@RequestParam(name = "queryId") String queryId, @RequestParam(name = "formData") String formData, HttpServletResponse response) throws IOException {
(...)
try {
queryResult = namedParameterJdbcTemplateHive.queryForList(query, paramSource);
for (Map<String, Object> map : queryResult) {
Map<String, Object> newMap = new HashMap<>();
for (Map.Entry<String, Object> entry : map.entrySet()) {
String key = entry.getKey();
Object value = entry.getValue();
if (key.contains(".")) {
key = key.replace(".", "_");
newMap.put(key, value);
} else {
newMap.put(key, value);
}
}
queryResultFinal.add(newMap);
}
} catch (Exception e) {
response.setStatus(400);
response.getWriter().write(e.getMessage());
return null; <-- SonarLint notification
}
return queryResultFinal;
}
Any idea on how to fix this notification?
I would recommend not catching the exception in this method, but instead throwing it and using an exception handler method in your controller to handle it. In that case you will never return null from the method, and Sonar will have nothing to complain about. It also means that you are using Spring the way it is designed to be used.
For example, something like the following:
@ExceptionHandler
@ResponseStatus(HttpStatus.BAD_REQUEST)
public void handleException(Exception e) {
log.error("Exception during request", e);
}
or the direct equivalent of your current handling:
@ExceptionHandler
public ResponseEntity<?> handleException(Exception e) {
return ResponseEntity.badRequest().body(e.getMessage());
}
You can remove the HttpServletResponse response parameter from your normal method after switching to an exception handler.
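After that change the endpoint only needs the happy path; roughly something like the sketch below (the query-building code that the question elides with (...) stays where it was, so query and paramSource are assumed to be built there):
@GetMapping("/runquery")
@ResponseBody
public List<Map<String, Object>> runQuery(@RequestParam(name = "queryId") String queryId,
                                          @RequestParam(name = "formData") String formData) {
    // (...) build query and paramSource as before
    List<Map<String, Object>> queryResultFinal = new ArrayList<>();
    List<Map<String, Object>> queryResult = namedParameterJdbcTemplateHive.queryForList(query, paramSource);
    for (Map<String, Object> map : queryResult) {
        Map<String, Object> newMap = new HashMap<>();
        for (Map.Entry<String, Object> entry : map.entrySet()) {
            // Replace dots in column names, as in the original loop (no-op when there is no dot).
            newMap.put(entry.getKey().replace(".", "_"), entry.getValue());
        }
        queryResultFinal.add(newMap);
    }
    return queryResultFinal; // any exception now propagates to the @ExceptionHandler
}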
I would recommend creating a GenericResponse that wraps all of your responses; it's also convenient for the front-end because it always deals with a fixed template.
With this solution, you can wrap any object you want and send it in the response.
I coded the scenario like this:
1- Create a GenericResponse Class
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class GenericResponse {
private Boolean error;
private List<ErrorPayload> errorPayload;
private Object payload;
public GenericResponse(Boolean error) {
this.error = error;
}
public static GenericResponse ok() {
return new GenericResponse(false);
}
public GenericResponse payload(Serializable o) {
this.payload = o;
return this;
}
//Getters and Setters and other Constructors
}
2-Create ErrorPayload Class
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class ErrorPayload {
private String errorMessage;
private String errorType;
//Getters and Setters and Constructors
}
3-Create ExceptionConverter Service (Used when we have exception)
@Service
public class ExceptionConverterService {
public GenericResponse convert(Exception x) {
GenericResponse genericResponse = new GenericResponse();
genericResponse.setError(true);
String exceptionType = x.getClass().getSimpleName();
String exceptionMessage = x.getMessage();
genericResponse.setErrorPayload(Collections.singletonList(new ErrorPayload(exceptionType, exceptionMessage)));
return genericResponse;
}
}
4-Change Your scenario with GenericResponse
All you need to do is:
Create aforementioned classes (Copy the code that I wrote in 1, 2 and 3)
Change your response from List<Map<String, Object>> to GenericResponse
Wrap your return types into GenericResponse
I changed your code as follows (only 3 lines change):
@RestController
public class TestController {
@Autowired
private ExceptionConverterService exceptionConverter;
@GetMapping("/runquery")
@ResponseBody
//Changed (Change Return type to GenericResponse )
public GenericResponse runQuery(@RequestParam(name = "queryId") String queryId, @RequestParam(name = "formData") String formData, HttpServletResponse response) throws IOException {
try {
//Your code
} catch (Exception e) {
//Changed (Create GenericResponse for Exception)
GenericResponse genericResponse = exceptionConverter.convert(e);
return genericResponse;
}
//Changed (Create GenericResponse for main result)
return GenericResponse.ok().payload(queryResultFinal);
}
}
Examples for two scenarios (the first without an exception, the second with one):
Sample 1
Controller with GenericResponse (We have no exception in this sample)
@RestController
public class TestController {
@GetMapping(value = "/getNameAndFamily")
public GenericResponse getNameAndFamily() {
Map<String, String> person = new HashMap<>();
person.put("name", "foo");
person.put("family", "bar");
return GenericResponse.ok().payload((Serializable) person);
}
}
The result is as follows:
{
"error": false,
"payload": {
"name": "foo",
"family": "bar"
}
}
Sample 2
Controller with GenericResponse when we have an exception in the business logic
@RestController
public class TestController {
@Autowired
private ExceptionConverterService exceptionConverter;
@GetMapping(value = "/getNameAndFamily")
public GenericResponse getNameAndFamily() {
try {
//Create Fake Exception
int i = 1 / 0;
return GenericResponse.ok();
} catch (Exception e) {
//Handle Exception
GenericResponse genericResponse = exceptionConverter.convert(e);
return genericResponse;
}
}
}
The result is as follows:
{
"error": true,
"errorPayload": [
{
"errorType": "ArithmeticException"
}
]
}
I followed this and created the Executor, Callable and ExecutorConfig exactly as described in the answer. Now I am getting the HttpServletRequest object in the AOP code, but the object doesn't contain anything; for example, request.getRequestURI() returns null.
In my AOP code I just need to read the Throwable and HttpServletRequest objects to store error information and some important request headers along with the URI in a table.
Here is my AOP code -
@Aspect
@Component
public class ErrorAspect {
private static final String EXCEPTION_EXECUTION_PATH = "execution(* com.myproject.*.service.impl.*.*(..))";
@Autowired
private ErrorHelper errorHelper;
@Pointcut(EXCEPTION_EXECUTION_PATH)
public void atExecutionException() {
}
@AfterThrowing(value = "atExecutionException()", throwing = "error")
public void storeErrorAfterThrowing( Throwable error) {
errorHelper.saveError(error);
}
}
And saveError() method in ErrorHelper is -
public void saveError(Throwable throwable) {
HttpServletRequest request = null;
if (RequestContextHolder.getRequestAttributes() != null) {
request = ((ServletRequestAttributes) RequestContextHolder.currentRequestAttributes()).getRequest();
}
Error error = prepareError(request, throwable);
CompletableFuture.runAsync(() -> insertError(error));
}
private Error prepareError(HttpServletRequest request, Throwable throwable) {
Error error = new Error();
if (request == null) {
String process = Constants.AUTO_JOB + LocalDateTime.now(ZoneId.of(PST_ZONE_ID)).toString().replaceAll("-", "");
error.setProcessType(Constants.AUTO_JOB);
error.setApplicationId(process);
error.setSessionId(process);
error.setUri(NA);
} else {
error.setProcessType(request.getHeader(Constants.PROCESS_ID));
error.setApplicationId(request.getHeader(Constants.APPLICATION_ID));
error.setSessionId(request.getHeader(Constants.SESSION_ID));
error.setUri(request.getRequestURI());
}
error.setEventDateTime(Instant.now());
error.setErrorType(getErrorType(throwable));
error.setErrorMessage(getErrorMessage(throwable));
return error;
}
This works perfectly fine with synchronous calls. But for @Async calls there is no header/URI information in the request object.
I created a decorator and copied the required request attributes to MDC. Here is the decorator code -
public class ContextAwareExecutorDecorator implements Executor, TaskExecutor {
private final Executor executor;
public ContextAwareExecutorDecorator( Executor executor) {
this.executor = executor;
}
@Override
public void execute( Runnable command) {
Runnable ctxAwareCommand = decorateContextAware(command);
executor.execute(ctxAwareCommand);
}
private Runnable decorateContextAware( Runnable command) {
RequestAttributes originalRequestContext = RequestContextHolder.currentRequestAttributes();
if (originalRequestContext != null) {
HttpServletRequest request = ((ServletRequestAttributes) originalRequestContext).getRequest();
copyRequestToMDC(request);
}
final Map<String, String> originalContextCopy = MDC.getCopyOfContextMap();
return () -> {
try {
if (originalRequestContext != null) {
RequestContextHolder.setRequestAttributes(originalRequestContext);
}
MDC.setContextMap(originalContextCopy);
command.run();
} finally {
MDC.clear();
RequestContextHolder.resetRequestAttributes();
}
};
}
private void copyRequestToMDC( HttpServletRequest request) {
if (request != null) {
MDC.put("requestURI", request.getRequestURI());
// Set other required attributes
}
}
}
Here is the Executor config -
@Configuration
public class ExecutorConfig extends AsyncConfigurerSupport {
@Override
@Bean("asyncTaskExecutor")
public Executor getAsyncExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setThreadNamePrefix("contextAwareExecutor-");
executor.initialize();
return new ContextAwareExecutorDecorator(executor);
}
@Override
@Bean
public AsyncUncaughtExceptionHandler getAsyncUncaughtExceptionHandler() {
return new CustomAsyncExceptionHandler();
}
}
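Because getAsyncExecutor() is the AsyncConfigurer default, plain @Async methods already run on the decorated executor; it can also be referenced explicitly by bean name. A hypothetical caller, purely for illustration:
@Service
public class ReportService { // hypothetical caller, not part of the question

    @Async("asyncTaskExecutor")
    public void generateAsync(String id) {
        // Runs on a contextAwareExecutor-* thread; the request attributes and MDC values
        // copied by ContextAwareExecutorDecorator are visible here and in any AOP advice around it.
    }
}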
Now, in the AOP code, I am able to retrieve the attributes from MDC:
error.setUri(MDC.get("requestURI"));
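For completeness, a rough sketch of how that MDC fallback could be wired into prepareError(..) when the request attributes are not available; the "processId" MDC key is hypothetical and would have to be populated by the decorator alongside "requestURI":
private Error prepareError(HttpServletRequest request, Throwable throwable) {
    Error error = new Error();
    if (request != null) {
        error.setUri(request.getRequestURI());
        error.setProcessType(request.getHeader(Constants.PROCESS_ID));
    } else {
        // Values copied into MDC by ContextAwareExecutorDecorator survive the thread switch
        // that @Async introduces, so they can be read here instead of the request.
        error.setUri(MDC.get("requestURI"));
        error.setProcessType(MDC.get("processId")); // hypothetical MDC key
    }
    error.setEventDateTime(Instant.now());
    error.setErrorType(getErrorType(throwable));
    error.setErrorMessage(getErrorMessage(throwable));
    return error;
}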