Passing server parameters as arquillian resource? - java

I'm using Arquillian for functional tests only. Arquillian is not managing a container (it's standalone) and is not deploying an app (that's also done manually). Since there's no deployment, I can't obtain the deployment URL using @ArquillianResource.
Also, it's a Maven project, and I have a property with the server hostname, which is pretty much what I need to get in the Arquillian test.
My question is: what would be another option to acquire a url except for hard coding it?

If the tests are run in the client JVM, you can probably use system properties. For example, with Maven it could be:
$ mvn test -Ddeployment.url=http://whatever
And in test code:
String url = System.getProperty("deployment.url", "http://defaulturl");
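Putting the two pieces together, a minimal helper might look like this sketch (the class name and default URL are illustrative):

```java
// Minimal sketch: read the URL passed via -Ddeployment.url,
// falling back to a default when the property is not set.
import java.net.URI;

public class DeploymentUrl {

    public static URI resolve() {
        String url = System.getProperty("deployment.url", "http://localhost:8080");
        return URI.create(url);
    }

    public static void main(String[] args) {
        System.out.println(resolve());
    }
}
```

Tests can then build request URLs from `resolve()` instead of a hard-coded host.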

Related

Azure release pipeline not picking up variable from nestedStack.yml file

I'm configuring our Release pipeline so that Integration Tests are automatically run after pull requests are merged to master and deployed to Dev environment.
I'm currently getting a connection error, specifically java.net.UnknownHostException:, and it looks like one of my output variables from my nestedStack.yml code is not being imported/read properly:
my-repo/cloud-formation/nestedStack.yml
You can see there is a property there "ApiGatewayInvokeUrl" which is marked as an Output. It is used on Azure DevOps for the "Integration Testing" task in my "Deploy to Dev" stage. It is written as $(ApiGatewayInvokeUrl) as that's how variables on Azure DevOps are used.
This Deploy to Dev stage will "succeed"; however, when I inspect the Integration Tests further, I see none actually ran and there was a connection error immediately. I can see it is outputting the variable literally as $(ApiGatewayInvokeUrl), so it looks like it just reads it as a String and never substitutes the correct URL value:
I was going off the way another team set up their Integration Tests on a similar pipeline, but I might have missed something. Do I need to define $(ApiGatewayInvokeUrl) somewhere in my codebase, or somewhere on Azure? Or am I missing something else? I checked the other team's code and didn't see them define it anywhere, which is why I am ultra confused.
Update: I went into AWS API Gateway, copied the invoke URL, and hard-coded it into the Azure DevOps Maven (integration testing) goal path, and now it's connecting. So it's 100% just not importing that variable for some reason.
You need to define/create the variable in order to use it (unless it's an automatic variable, and this one is definitely not an automatic variable). That variable isn't getting substituted because it doesn't exist (AFAIK).
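For reference, a pipeline-level variable can be declared in the pipeline YAML. This is a hypothetical sketch; the value shown is a placeholder, and in this scenario it would need to be populated from the CloudFormation stack output rather than hard-coded:

```yaml
# Hypothetical sketch for azure-pipelines.yml: declaring the variable so
# $(ApiGatewayInvokeUrl) resolves. The value here is a placeholder.
variables:
  ApiGatewayInvokeUrl: 'https://example.execute-api.us-east-1.amazonaws.com/dev'
```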

Use log4j on a deployed war

I've made a trivial RESTful service on a JBoss server that just says "hello" when it receives any request. This part works and it is already deployed.
To go further with my project I need to log. The other applications on the server use log4j configured by a log4j.xml placed on a specific folder on the server.
For an offline project I'm used to having a runnable main method, which in this case I would use to execute DOMConfigurator.configure(filepath) (where filepath points to the log4j.xml), and I would then expect to log with those settings.
However, since this is a deployed service (and since I'm not skilled enough to figure it out myself), how would I do such a thing?
The question in my opinion could be interpreted in two ways:
How do I run some code "for sure" when I deploy something (similar to a main method)? Do I need something like Spring Boot?
How do I setup log4j on a deployed service on JBoss?
(I don't know if one question excludes the other...)
Thanks for the help!
1) If you want to run some code "for sure", you can create a @PostConstruct method in one of your beans. Both Spring and EJB support it.
2) As for the log4j configuration, it should be enough to put it on the classpath and add the corresponding dependencies; no explicit configuration of the path should be needed.
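For point 1, a minimal sketch of an EJB startup singleton that configures log4j at deployment time might look like this (the log4j.xml path is an assumption; adjust it to the server's actual config folder):

```java
// Sketch: configure log4j once at deployment time via a startup bean.
// The file path below is a placeholder for the server's shared log4j.xml.
import javax.annotation.PostConstruct;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import org.apache.log4j.Logger;
import org.apache.log4j.xml.DOMConfigurator;

@Singleton
@Startup
public class LoggingBootstrap {

    private static final Logger LOG = Logger.getLogger(LoggingBootstrap.class);

    @PostConstruct
    public void init() {
        // Runs once when the deployment starts, like a "main" for the service.
        DOMConfigurator.configure("/path/to/server/conf/log4j.xml");
        LOG.info("log4j configured");
    }
}
```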

Integration tests fail if run via Maven, but pass if run via IDE

I have an acceptance test module that makes an EJB call to another module. For this acceptance test, I don't want it to call the EJB directly, as this may make my tests very slow. To overcome that, I used SimpleNamingContextBuilder to mock the JNDI call. For my acceptance tests, I am using the Cucumber framework.
I created my own annotation @MockJndiEntityService; the Java method that does the JNDI binding looks like:
@Before("@MockJndiEntityService")
public void mockJndiService() throws NamingException {
    final EjbServiceRemote ejbService = new EjbServiceRemoteStubImpl();
    final SimpleNamingContextBuilder contextBuilder = new SimpleNamingContextBuilder();
    contextBuilder.bind(EjbServiceRemote.EJB_NAME, ejbService);
    contextBuilder.activate();
}
EjbServiceRemoteStubImpl is a stub I would like tests to call instead of the service itself.
and the feature file has the heading
@MockJndiEntityService
@transaction
Feature: Lookups from library of policy
When I run this code via the IDE (Intellij), it works fine and everything passes.
But when I run it through the build tool (Maven), I get the remote lookup failure
Caused by: javax.naming.NameNotFoundException: Name
[ejb/EjbServiceEJB#ejb.service.EjbServiceRemote]
not bound; 1 bindings:
[ejb/EjbServiceEJB#ejb.service.EjbServiceRemote
It seems like the annotation is not being applied for each feature/scenario.
Has anyone ever come across this? Your help will be much appreciated.

Arquillian and Selenium in mixed Container/Client mode

I am reading the tutorial on Arquillian's website:
http://arquillian.org/guides/functional_testing_using_drone/
Under the paragraph "Enabling Client Mode" they state that it is possible to mix in-container and client modes in the same test: just leave off the testable attribute. Any method annotated with @RunAsClient will execute from the client; the remainder will execute inside the container, giving you the best of both worlds!
Here is my Issue.
I want to write a test that uses
@Drone
DefaultSelenium browser;
and
@EJB
MyXXXRepository
I have one test that adds a user to the in-memory database before a Selenium test logs in through the browser with that user...
In order to get Selenium to work, I need to mark the @Deployment as testable = false, but this will cause my @EJB to fail.
According to the documentation, I can skip testable = false if I tell the Selenium test method that it should run in client mode, and this should work.
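The mixed-mode setup I'm describing looks roughly like this pseudocode sketch (class, bean, and method names are illustrative):

```java
// Pseudocode sketch of the mixed-mode test; not a complete, runnable class.
@RunWith(Arquillian.class)
public class UserLoginTest {

    @Deployment // no testable=false, so tests run in-container by default
    public static WebArchive createDeployment() { /* ... */ }

    @EJB
    MyXXXRepository repository; // injected in-container

    @Drone
    DefaultSelenium browser; // only meaningful on the client side

    @Test
    @InSequence(1)
    public void addUser() {
        // runs inside the container and can use the EJB
    }

    @Test
    @InSequence(2)
    @RunAsClient // runs from the client JVM
    public void loginWithBrowser() {
        // drive Selenium against the deployed app
    }
}
```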
But!!!
This will throw an Exception
Caused by: java.lang.NoClassDefFoundError: Lcom/thoughtworks/selenium/DefaultSelenium;
So I need to be able to tell the
@Drone
DefaultSelenium browser;
to run in client mode as well...
Any takers?
Drone is intended to be client-side. Personally, I've never tried to deploy WebDriver/Drone tests and run them from the server; this sounds a bit crazy :) And obviously, since the test itself is mixed, the classloader complains about the Drone-related imports.
But I have a solution for you which lets you test from the "grey-box" perspective. There is a fairly new extension in the Arquillian universe called Warp which allows you to solve your very problem. Here's the guide.
Hope that helps.
I solved the problem by using an import script that imports the user prior to the test; this way I do not need to instantiate the repository, and it is now a clean client-side test.

Integration Testing with Redis

I've started using Redis in my project with the help of the Jedis library. All is working fine, but now I have a problem: my functional tests require Redis to be up, which I want to avoid in my Continuous Integration. What is the best way to do this?
I've implemented a simple redis embedded runner for Java:
https://github.com/kstyrc/embedded-redis
Currently, it uses Redis 2.6.14 for *nix and https://github.com/MSOpenTech/redis for Windows. However, you can use the RedisServer class to run your own run script.
I plan to extend the implementation to support RedisConf (bind, slaveof, port, dbfilename, etc). After this, I'll upload the jar to Clojars for Maven deps.
Here are a few options you have for functional/integration testing:
1) Just start an instance of Redis on your CI server. All tests will be responsible for doing proper cleanup after execution.
2) Try to control the Redis process somehow, i.e. have some shell script or job on the CI server to start/stop it before/after running the tests. At least some of the burden of setup/cleanup is removed from the tests, as each independent build will have an independent Redis setup.
3) Control Redis further by using some in-memory solution like the one you mention for Cassandra (if one exists).
One thing to mention is that integration tests should not replace unit tests. Unit tests should probably be preferred: they can cover more cases, whereas integration tests can be used just to check that all parts of the application play nicely together. I think this is the reason why a lot of people choose option number one.
Here is a similar question about MongoDB. The answer has a link to a project which works for the second option (it controls the mongod process). If you follow some related links on the project page, there's also something called nosql-unit, which I think tries to cover option three. I didn't use it, but it looks like it has something for Redis too.
You can start Redis server on an arbitrary port via the command line: redis-server --port 7777. So for the purposes of integration testing, you could start on Redis on an available (or random) port, making sure that Jedis is configured to use that port.
In this way, you've got a "fresh" instance of Redis that you know won't conflict with any other processes, including other test runs occurring at the same time. This is as close as I can think of to the analogue of running an in-memory/embedded database for integration testing.
For pre-loading Redis with "canned data," use the --dbfilename <file> switch: redis-server --port 7777 --dbfilename test.rdb.
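A rough Java sketch of this approach, assuming redis-server is available on the PATH (class and file names are illustrative):

```java
// Sketch: pick a free port, then launch a throwaway redis-server on it,
// so concurrent test runs never collide. Assumes redis-server is on the PATH.
import java.io.IOException;
import java.net.ServerSocket;

public class ThrowawayRedis {

    // Ask the OS for a free port by binding to port 0.
    public static int freePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws Exception {
        int port = freePort();
        Process redis = new ProcessBuilder(
                "redis-server", "--port", String.valueOf(port),
                "--dbfilename", "test.rdb")
            .inheritIO()
            .start();
        try {
            // ... point Jedis at ("localhost", port) and run the tests ...
        } finally {
            redis.destroy();
            redis.waitFor();
        }
    }
}
```

The same free-port trick works for any external process a test suite has to spawn.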
Try nosql-unit. It supports Redis unit testing with Java.
I have tried EmbeddedRedis and found that many Jedis interfaces are not supported. Hence, using EmbeddedRedis is not a good idea, especially when you are using some advanced Redis feature like "pipeline".
I suggest using ManagedRedis for unit tests:
1) Download the Redis source code from redis.io into your test resources.
2) Build redis-server in $(your-redis-dir)/src.
3) Write a unit test with ManagedRedis; here is an example. Note that "REDIS_HOME" is the dir where your Redis code was downloaded; ManagedRedis will find redis-server in ${REDIS_HOME}/src.
4) Run your unit test.
As @kstyrc mentioned in his answer, I basically used his solution. It was working in this project. You just need to add the embedded-redis dependency:
<dependency>
    <groupId>com.github.kstyrc</groupId>
    <artifactId>embedded-redis</artifactId>
    <version>0.6</version>
    <scope>test</scope>
</dependency>
Then, in the test class, define the redisServer:
RedisServer redisServer;

@Before
public void setUp() throws IOException {
    redisServer = new RedisServer();
    redisServer.start();
}

@After
public void tearDown() {
    // stop the embedded server so it doesn't keep running after the tests
    redisServer.stop();
}
Also, define application.yml with the credentials below:
spring:
  redis:
    host: localhost
    port: 6379
The best way I could handle this same problem was to create a Spring service that wraps the RedisTemplate. After that, I just use @MockBean to mock the service and abstract away the lack of a running Redis instance during my tests.
For example:
@Service
class RedisService {

    @Autowired
    private RedisTemplate<String, SomeClass> redisTemplate;

    SomeClass get(String key) {
        return redisTemplate.opsForValue().get(key);
    }
}
And in the integration test:
class IntegrationTest {

    @MockBean
    private RedisService redisService;

    @Before
    public void setup() {
        SomeClass someClass = new SomeClass();
        when(redisService.get(anyString())).thenReturn(someClass);
    }
}
I'm skeptical of using an in-memory Redis database solution, because as I understand it, the current alternatives are not "officially" recommended by the Spring team.
