How to create mock VaadinSession for integration tests? - java

My Issue:
This is my first time writing tests for a Vaadin UI, and I am also fairly new to unit tests in general. My issue is that I can't do anything with my UI component views because there is no VaadinSession to handle the UI beans. A VaadinSession is never created when using @SpringBootTest. I was able to create tests for my backend since Spring handles those beans, but I can't figure out a way to get Vaadin to start up a session so that I can access it and run different integration and unit tests.
What I've Tried
TestBench: The Vaadin TestBench seemed like a very good option, but the issue I faced was that it doesn't seem to open a VaadinSession when I open a ChromeDriver() that navigates to the website on my localhost.
Karibu Library: This library also seemed like a very good option, but there was one issue: it works with UI components that are instantiated directly, whereas a couple of my UI component classes use dependency injection to receive backend services. I cannot instantiate those classes myself because of the dependency injection.
The UI component that I need to access through the VaadinSession:
@Component
@UIScope
@CssImport("./styles/current-info-styles.css")
public class CurrentDayView extends VerticalLayout implements Updatable {

    private static final long serialVersionUID = 1L;
    //Some code here

    @Autowired
    public CurrentDayView(NowcastWeatherService nowcastWeatherService, GeoLocationService geoLocationService) {
        this.nowcastWeatherService = nowcastWeatherService;
        this.geoLocationService = geoLocationService;
        //Some Code here
    }

    //Some code here
}
My TestBench approach:
@RunWith(SpringRunner.class)
@SpringBootTest
public class CurrentDayViewTest extends TestBenchTestCase {

    @Test
    public void fakeTest() {
        Assert.assertTrue(true);
    }

    @Before
    public void startUp() {
        System.setProperty("webdriver.chrome.driver", "src/main/resources/drivers/chromedriver.exe");
        setDriver(new ChromeDriver());
        getDriver().get("http://localhost:8080/");
        populateViewWithInformation();
    }

    @After
    public void tearDown() {
        getDriver().quit();
    }

    private void populateViewWithInformation() {
        CurrentDayView currentDayView = (CurrentDayView) VaadinSession.getCurrent().getAttribute("current-day-view");
        //This is where I get an error because VaadinSession.getCurrent() is null
    }
}
My Final Question:
Does anyone have any idea how I could get a VaadinSession created, or at least get Spring to keep track of the Vaadin UI components? If this wasn't clear, please feel free to ask for more clarification on my question.

I suggest you give Karibu another shot; it's great for these kinds of tests that don't need the app to be running.
Take a look at the Karibu V14 Spring demo project and pay attention to which Karibu dependency is used. The ApplicationTest#listOrders test contains navigation to a view with autowired dependencies.
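To make that concrete, here is a minimal sketch of what such a test could look like for the CurrentDayView above, modelled on that demo. It assumes the karibu-testing-v10-spring dependency, JUnit 4, and a hypothetical base package com.example.weather for route auto-discovery; with the mocked session in place, Spring creates the UI-scoped view including its injected services.
import com.github.mvysny.kaributesting.v10.MockVaadin;
import com.github.mvysny.kaributesting.v10.Routes;
import com.github.mvysny.kaributesting.v10.spring.MockSpringServlet;
import com.vaadin.flow.component.UI;
import kotlin.jvm.functions.Function0;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.ApplicationContext;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class CurrentDayViewKaribuTest {

    // "com.example.weather" is a placeholder for the package that contains your views
    private static final Routes routes = new Routes().autoDiscoverViews("com.example.weather");

    @Autowired
    private ApplicationContext ctx;

    @Before
    public void setupVaadin() {
        // Karibu creates the VaadinSession and UI, and MockSpringServlet makes sure
        // view beans are created by Spring, so constructor injection works.
        final Function0<UI> uiFactory = UI::new;
        MockVaadin.setup(uiFactory, new MockSpringServlet(routes, ctx, uiFactory));
    }

    @After
    public void tearDownVaadin() {
        MockVaadin.tearDown();
    }

    @Test
    public void currentDayViewGetsItsServicesInjected() {
        // With the mocked session and UI in place, the UI-scoped bean can be resolved,
        // including its NowcastWeatherService and GeoLocationService dependencies.
        CurrentDayView view = ctx.getBean(CurrentDayView.class);
        Assert.assertNotNull(view);
        // assert on the view's components here, e.g. with Karibu's LocatorJ._get() helpers
    }
}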
The issue with your TestBench test is that TestBench is used to test against a running application, and the tests run in an entirely different process than the actual application.
So when you open the page with the driver, a Vaadin session is created in the application, but you will not be able to access it in your tests, nor will you be able to access any UI state or views. What it allows you to do, however, is to interact with the application as you would do through the browser (clicking buttons, filling in text fields etc.), and to check that the state in the browser is correct, without knowing anything about the server's internal state.

Generally speaking, when you're writing integration tests with TestBench (which is based on Selenium), what you're doing is directing the browser. You're writing Java code, yes, and the code may even be in the same project as your Vaadin UI code, but it can be executed against any URL and what you're interacting with is the browser's DOM. You're describing what the end-user would do: click a button, write some text in an input field, choose an option from a radio button group. The server-side is a black box. After all, if I submit a post on StackOverflow, I can't check if it gets stored in a database - all I can do is look at what I see after I press the "Post your answer" button. If you really want to nitpick, it doesn't even need to be a Vaadin application you're testing with TestBench, as long as the application behaves like one in the browser.
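For illustration, a TestBench test in this style only talks to the DOM of the running application; something along these lines, where the element IDs ("city-input", "search-button", "current-temperature") are hypothetical and the app must already be running on localhost:8080:
import com.vaadin.flow.component.button.testbench.ButtonElement;
import com.vaadin.flow.component.html.testbench.SpanElement;
import com.vaadin.flow.component.textfield.testbench.TextFieldElement;
import com.vaadin.testbench.TestBenchTestCase;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.chrome.ChromeDriver;

// Note: no @SpringBootTest here - the test only drives a browser against an
// application that is already running on localhost:8080.
public class CurrentDayViewIT extends TestBenchTestCase {

    @Before
    public void openBrowser() {
        setDriver(new ChromeDriver());
        getDriver().get("http://localhost:8080/");
    }

    @After
    public void closeBrowser() {
        getDriver().quit();
    }

    @Test
    public void searchShowsCurrentWeather() {
        // Interact exactly as a user would: type a city and click the search button.
        $(TextFieldElement.class).id("city-input").setValue("Berlin");
        $(ButtonElement.class).id("search-button").click();
        // Assert on what is visible in the browser, not on server-side state.
        Assert.assertFalse($(SpanElement.class).id("current-temperature").getText().isEmpty());
    }
}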
Secondly, you shouldn't store any Components in the VaadinSession. If you open a Vaadin application in multiple browser tabs, each of those tabs will share the same VaadinSession. A single Component instance should only be used inside one browser tab, where the root component is the current UI.

Related

How to serve dynamically changing photo to client via Spring MVC?

I wrote an MVC application and I wanted to add an option for editing the photo. I serve the image as a div whose background is the image, and the div has a click listener that posts suitable requests to the application.
On the application side, another program then edits the photo as the client asked, saves it, and overwrites the old one. The problem is that until I redeploy the application it keeps serving the old file even though it has changed. I think the problem is that I serve the photos from the resources folder and these are called "static resources", so I should not change them. Another thing is that IntelliJ is not refreshing the photo. Do you have any idea what I am doing wrong, or do you know a better way to serve those photos?
So, as Prashant mentioned in a comment, all I had to do was set the cache period to 0.
EDIT:
Following are two ways of doing it:
Either set in the application.properties file:
spring.resources.cache-period = 0
Or, add in the ResourceHandler:
@Component
public class WebConfigurations extends WebMvcConfigurerAdapter {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/static/**")
                .addResourceLocations("classpath:/static/")
                .setCachePeriod(0);
    }
}
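On newer Spring versions, where WebMvcConfigurerAdapter is deprecated, the same idea can be expressed by implementing WebMvcConfigurer directly; a minimal sketch:
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Component
public class WebConfigurations implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/static/**")
                .addResourceLocations("classpath:/static/")
                .setCachePeriod(0); // serve the edited photo immediately instead of a cached copy
    }
}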

Looking for the best way to share an interface between microservices with quarkus

I'm still quite new to microservices and have a few basic architectural questions that I can't resolve on my own right now.
I'm using the Quarkus framework with the standard extensions like quarkus-resteasy and quarkus-rest-client for the realization.
The scenario:
I have an example of a "Persistence" service, living in a dedicated Maven project, that I want to populate with data from the outside via a REST call.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
public class Persistence{
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto postDto) {
Post post = toPostMapper.toResource(postDto);
entityManager.persist(post);
return Response.ok(postDto).status(201).build();
}
}
At the same time I would like to have a DataGenerator microservice which generates the corresponding data and passes it to the Persistence service.
My problem: API sharing
Both services were created as Maven projects.
According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project, like this:
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public interface PersistenceApi {
#POST
#Transactional
public Response create(PostDto post) ;
}
This interface is then injected into the DataGenerator service via @Inject, which leads to the following example service.
@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}
I have the Persistence service running locally on port 8181 and have added the following entries to the application.properties of the DataGenerator project so that the service can be found:
furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton
I find it "wrong" to declare the interface in my DataGenerator, because at this point I don't notice when the api provided by the Persistence service changes. Accordingly one could come up with the idea to position the interface in the Persistence service, which is then implemented by my concrete Persistence implementation and leads to the following code.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto fruit) {
Post post = toPostMapper.toResource(fruit);
entityManager.persist(post);
return Response.ok(fruit).status(201).build();
}
}
In order to use the interface in my DataGenerator project, I would have to include the Persistence project as a dependency in my DataGenerator project, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of separation of concerns.
I have tried the following approach:
I created another Maven project called PersistenceApi which contains only the corresponding PersistenceApi interface. This PersistenceApi project was then included as a dependency in both the "Persistence" and the "DataGenerator" projects. In the "Persistence" project I implement the service from the example above, and in the "DataGenerator" project I try to address the corresponding interface via @Inject.
Unfortunately this does not work. When I build the service, I get an UnsatisfiedResolutionException telling me that the PersistenceApi dependency, which I want to inject into the DataGenerator service via @Inject, cannot be injected.
Now my questions:
I don't see what I'm missing here. Could you help me?
Is this kind of API-sharing with dedicated Api projects a viable way or is the "monolith with extra steps" approach really the way to go?
Thank you in advance.
That's a common problem with microservices. Like in the book "Microservices: Grundlagen flexibler Softwarearchitekturen" by Eberhard Wolff (I saw that you are German too), I follow the idea that microservices should have the same coupling as the teams developing them and as the organization you are developing them for (have a look at Conway's law). Therefore, services of mostly independent teams should be developed independently, and the API changes of one service should not affect another at the time of the update.
If you develop both services in your own team, then I think you can couple them the way you are doing it, because you don't have to coordinate with other teams and there will be no huge overhead. Note that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:
I use API versioning so the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
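For illustration, a minimal sketch of what path-based versioning could look like for the persistence API from the question; the V2 interface, PostDtoV2, and the exact path prefixes are hypothetical:
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import org.eclipse.microprofile.rest.client.inject.RegisterRestClient;

// PersistenceApiV1.java - the old contract stays reachable for existing clients
@Path("/api/v1/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApiV1 {
    @POST
    Response create(PostDto post);
}

// PersistenceApiV2.java - the new, incompatible contract lives side by side under
// a new prefix, so consumers can migrate on their own schedule
@Path("/api/v2/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApiV2 {
    @POST
    Response create(PostDtoV2 post); // PostDtoV2 is a hypothetical new DTO
}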
Have a look at Domain-Driven Design for different ways of integrating bounded contexts (= services) and the coupling consequences they have. Without API versioning you are forced into a partnership and you need to release together. Maybe you prefer Customer-Supplier or even Conformist.
To test compatibility between both services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help to notify people about changes.

Is it possible to activate or deactivate jobs via a configuration file so as to avoid unintentional startups?

I would like to have the possibility to activate or deactivate jobs via:
a configuration file with a specific ON/OFF setting for each job, or
a MySQL table with a specific ON/OFF setting for each job.
The setting should be picked up on every status change: for example, if a job is OFF and is switched to ON, the Java app should receive the status update.
Thanks for helping me out.
If I understood your question correctly, you are looking for the capability to control your jobs from configuration. If yes, then the following might be helpful.
You can schedule your jobs using Apache Camel routing:
public class JobExecutorRoute extends RouteBuilder {

    private String script = "exec:yourjob.sh";
    // every day at 10:00 (seconds minutes hours day-of-month month day-of-week)
    private String cron = "quartz2://group/timer?cron=0+0+10+*+*+?";

    public void configure() throws Exception {
        // the quartz2 endpoint is the trigger (consumer side), the exec endpoint runs the job
        from(cron).autoStartup(true).to(script);
    }
}
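To get the ON/OFF switch from a configuration file, one option (a sketch, not tested against your setup) is to drive the route's autoStartup flag from a property placeholder and give the route an id, so it can also be started or stopped at runtime when the flag in your MySQL table changes; the property key jobs.myjob.enabled is hypothetical:
import org.apache.camel.builder.RouteBuilder;

public class ConfigurableJobRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // "jobs.myjob.enabled" is a hypothetical key, e.g. jobs.myjob.enabled=true in a
        // properties file registered with Camel's PropertiesComponent.
        from("quartz2://group/timer?cron=0+0+10+*+*+?")
                .routeId("myJobRoute")
                .autoStartup("{{jobs.myjob.enabled}}")
                .to("exec:yourjob.sh");
    }
}

// When the flag changes at runtime (for example after polling your MySQL table),
// the route can be toggled through the CamelContext, e.g.:
//   camelContext.getRouteController().stopRoute("myJobRoute");
//   camelContext.getRouteController().startRoute("myJobRoute");
// (on older Camel 2.x versions the equivalent calls are camelContext.stopRoute(...)
//  and camelContext.startRoute(...)).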

JAX-RS: Server-side user interface

I have a JAX-RS REST service which needs to provide some visual server side output when a REST endpoint is invoked. Later on, I might also want to provide a user interface which an administrator will use for interacting with the web service. I could obviously build REST endpoints for this interaction, and have the administrator invoke these (from a different machine or the same machine) using a provided client application with a UI, but I would like to avoid exposing this functionality to the network. In order to make deployment easy, I would also like everything (i.e. the web service and its administrator UI) to be part of the same application.
I have found that my JAX-RS application will throw a HeadlessException if I try to construct a JFrame or any other top level UI element as described here. I have also found that I can avoid this by setting the system property -Djava.awt.headless=false.
I think I may be able to achieve what I need by setting the headless system property as above and defining a singleton startup EJB that will handle all server side UI tasks:
@Singleton
@Startup
public class UiBean {

    private JFrame frame;

    @PostConstruct
    public void init() {
        SwingUtilities.invokeLater(() -> {
            frame = new JFrame();
            // ...setup UI...
        });
    }

    // JAX-RS resources will inject the UiBean and invoke methods such as this one when UI updates are needed.
    public void updateSomeUiComponent() {
        SwingUtilities.invokeLater(() -> { /* ... */ });
    }
}
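For illustration, a JAX-RS resource could then use the bean roughly like this; ThingResource, ThingDto and the path are hypothetical names, not part of my actual code:
import javax.ejb.EJB;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

@Path("/things")
public class ThingResource {

    @EJB // or @Inject, depending on how the bean is exposed
    private UiBean uiBean;

    @POST
    public Response createThing(ThingDto dto) {
        // ... handle the REST call itself ...
        uiBean.updateSomeUiComponent(); // reflect the call in the server-side UI
        return Response.status(201).build();
    }
}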
I realise that this design is not ideal. Are there other ways of achieving my requirements? The software being built is a prototype and time is of the essence, so I would like to avoid having to learn complete new technologies (already invested a lot in learning JAX-RS and JPA). Quick-and-dirty is ok, I guess I'm just looking for the least dirty solution ;).
Thanks in advance!

How to periodically refresh a view in vaadin?

I'd like to refresh a certain part of a page periodically. Therefore I created a @Scheduled method that applies the changing values to the widgets.
But the method never executes:
@Controller
@UIScope
public class MyViewPresenter {

    private View view;

    @Scheduled(fixedRate = 1000)
    public void refresh() {
        System.out.println("this is never executed. why?");
        //view.change...
    }
}
When I move this method into my @Configuration class, the sysout is printed fine. So in general I can assume the scheduling works as expected, but not in my presenter class. Why?
You need to enable the server push feature; see section 11.16, "Server Push", in the Vaadin documentation.
In short:
add the vaadin-push library to your dependencies
enable pushing (the @Push annotation or servlet configuration)
use UI.access(..) for pushing
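A minimal sketch of those three steps, assuming Vaadin Flow with the Spring add-on; the route name, the Span field and the executor-based timer are illustrative rather than taken from the question, and a real implementation should cancel the task when the UI is detached:
import com.vaadin.flow.component.UI;
import com.vaadin.flow.component.html.Span;
import com.vaadin.flow.component.orderedlayout.VerticalLayout;
import com.vaadin.flow.component.page.Push;
import com.vaadin.flow.router.Route;
import java.time.Instant;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

@Push                   // step 2: enable server push; in Flow this usually sits on the root layout
                        // or app shell, placed on the view here only because the sketch has no parent layout
@Route("refresh-demo")  // hypothetical route
public class RefreshDemoView extends VerticalLayout {

    private final Span clock = new Span();

    public RefreshDemoView() {
        add(clock);
        UI ui = UI.getCurrent();
        // step 3: a background thread may only touch UI components inside ui.access(),
        // which locks the VaadinSession and pushes the changes to the browser.
        Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(
                () -> ui.access(() -> clock.setText("Now: " + Instant.now())),
                0, 1, TimeUnit.SECONDS);
    }
}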
