I have a Play 2.3 application that I want to test.
This application has a Global.java class that extends GlobalSettings in order to start a recurring Akka task every 5 minutes.
During testing I don't want the task to be scheduled, since it creates several issues and I don't need it.
Therefore I would like to override the GlobalSettings.
From reading the documentation, it looks like it should be possible to use a FakeApplication for that. However, I have tried to do that in several ways and the framework still runs my default global settings.
I created a base class for my tests that looks like this:
public class BaseTest extends WithApplication
{
    @Override
    protected FakeApplication provideFakeApplication()
    {
        return fakeApplication(inMemoryDatabase("test"), new GlobalSettings());
    }
}
According to the documentation, if a test class extends WithApplication, a fake application should automatically start for me with the configuration provided.
Regardless of whether this happens or not, even before the testing methods are called, the default global settings trigger. The new GlobalSettings() instance doesn't override the default one.
I also tried to manually start the fakeApplication using a @BeforeClass annotation, with no success.
I am running the tests with the "activator test" command.
It looks like the fakeApplication is indeed used for each test, but before even the first test starts, the main application is started and its Global triggered. And that's what I don't want to happen.
Am I doing something wrong, or is it a bug in Play? If it is a bug, is there a workaround?
EDIT: I just noticed that even the database settings don't get overridden correctly. I normally use an H2 file database for development, but I want a different, in-memory one for testing. However, the code above doesn't change the database used, and therefore my tests run against my file DB.
I also tried something like this:
@Test
public void testMyTest()
{
    running(fakeApplication(inMemoryDatabase("test2")), () -> {
        // TESTING CODE THAT USES DB
    });
}
and still any query inside the body runs against the DB configured in the config file, not the in-memory database.
Edit
Chafik's solution kind of worked for me, since by specifying a different config file in the build.sbt file I managed to override my settings. Things are still really weird though:
1) Now, if from my fakeApplication constructor I try to override the GlobalSettings by passing a new instance to the helper method, the settings are correctly overridden, while before I could not override the main one at all.
2) If I revert my change and don't supply a test conf file, I can still override the GlobalSettings. That is, the behaviour is different than it was initially.
Something is definitely strange around the test command, its configuration, running scope, and the way fakeApplication overrides the configuration, and/or the documentation about it is definitely unclear and lacking. However, since I at least achieved what I wanted to do, I'll still consider the answer solved.
I did what you want like this.
Set a different config file for testing in your build.sbt
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
Create conf/application.test.conf
Include the main configuration file at the beginning: include "application.conf"
Override the settings that you want
Create a property like this startAkkaActor=true in the main config file
Create a property like this startAkkaActor=false in the test config file
Update your Global.java where you start your Akka actor:
if (Play.application().configuration().getBoolean("startAkkaActor")) {
    // Start your Akka actor
}
You can do the same with your database settings
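To make the steps concrete, here is a sketch of what conf/application.test.conf could look like. The startAkkaActor property comes from the steps above; the database lines are a hypothetical override assuming the default H2 setup:

include "application.conf"

startAkkaActor=false

# hypothetical in-memory database override
db.default.driver=org.h2.Driver
db.default.url="jdbc:h2:mem:test"

And a hedged sketch of the matching Global.java, assuming the task is scheduled from onStart using play.libs.Akka (your actual scheduling code may differ):

import java.util.concurrent.TimeUnit;

import play.Application;
import play.GlobalSettings;
import play.libs.Akka;
import scala.concurrent.duration.Duration;

public class Global extends GlobalSettings {
    @Override
    public void onStart(Application app) {
        super.onStart(app);
        // Only schedule the recurring task when the config flag allows it
        if (app.configuration().getBoolean("startAkkaActor")) {
            Akka.system().scheduler().schedule(
                    Duration.create(0, TimeUnit.SECONDS),  // initial delay
                    Duration.create(5, TimeUnit.MINUTES),  // repeat interval
                    new Runnable() {
                        public void run() {
                            // recurring task body
                        }
                    },
                    Akka.system().dispatcher());
        }
    }
}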
The config file must be defined in build.sbt because Play forks a JVM for each test without duplicating the parameters set in the main JVM. The following does not work:
activator test -Dconfig.file=conf/application.test.conf
Related
I have a bit of code that looks like this:
#Value("#{systemProperties['TARGET_ENVIRONMENT']?: 'qa'}")
private String environment;
In my integration tests, this property is always blank, and I can't seem to do anything to give it definition. I tried doing something like this in my integration tests to simply override the target environment attribute to something else:
@Before
public void setUp(){
    System.setProperty("TARGET_ENVIRONMENT", "I love dogs");
}
But that didn't really work out.
@Before
public void setUp(){
    System.setProperty("TARGET_ENVIRONMENT", "qa"); // not "I love dogs"
}

@Value("$peopleSearch{key}") // defined as per properties file
private String peopleSearch;

@Value("$peopleSearch{address}") // defined as per properties file
private String address;
But that doesn't really change anything. I have other properties (as above) which are being defined in properties files that work out ok and get values, but this one seems to be using the systemProperties attribute, which I have no idea how/where to modify. What do I do to override systemProperties attributes?
systemProperties isn't a function but a variable, and it's initialized at the time the Spring context is loaded. That's why setting system properties at runtime has no effect in your tests. Here you can find more details on SpringEL.
You can pass the system property -DTARGET_ENVIRONMENT to the runner of the test. For example, if you run the test in an IDE (Eclipse or IntelliJ), you can pass it under JVM arguments of the Run Configurations. Or if you run it with Maven, see this.
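For the Maven case, a hedged sketch of the surefire configuration (plugin version omitted; the value "qa" is just an example):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <systemPropertyVariables>
            <TARGET_ENVIRONMENT>qa</TARGET_ENVIRONMENT>
        </systemPropertyVariables>
    </configuration>
</plugin>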
This surely is a common problem. I have a properties file like my-settings.properties which is read by an application class. When I write a test class, it needs to test different scenarios of things that could be present in my-settings.properties in order to ensure maximum code coverage (e.g. empty properties file, basic properties file etc). But I can only have one my-settings.properties in my src/test/resources.
What would be really great is if there was just some annotation
@MockFileOnClassPath(use = "my-settings-basic.properties", insteadOf = "my-settings.properties")
Then I could just have multiple my-settings-XXX.properties files in my /src/test/resources and just annotate the correct one on each test method. But I can't find anything like this. I'm using JUnit 4.12.
I can think of a couple of crude solutions:
Before each test, find the file on the file system, copy it using filesystem I/O, then delete it again after the test. But this is clumsy and involves a lot of redundancy. Not to mention I'm not even sure whether the classpath directory will be writable.
Use a mocking framework to mock getResource. No idea how I would even do that, especially as there are a million different ways to get the file (this.getClass().getResourceAsStream(...), MyClass.class.getResourceAsStream(...), ClassLoader.getSystemClassLoader().getResourceAsStream(...) etc.)
I just think this must be a common problem and maybe there is already a solution in JUnit, Mockito, PowerMock, EasyMock or something like that?
EDIT: Someone has specified that this question is a duplicate of Specifying a custom log4j.properties file for all of JUnit tests run from Eclipse but it isn't. That question is about wanting to have a different properties file between the main and test invocations. For me I want to have a different properties file between a test invocation and another test invocation.
I find that whenever dealing with files, it's best to introduce the concept of a Resource.
eg:
public interface Resource {
    String getName();
    InputStream getStream();
}
Then you can pass the resource in via dependency injection:
public class MyService {

    private final Properties properties;

    public MyService(Resource propFile) throws IOException {
        this.properties = new Properties();
        this.properties.load(propFile.getStream());
    }

    ...
}
Then, in your production code you can use a ClasspathResource or maybe a FileResource or URLResource etc but in your tests you could have a StringResource etc.
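For the test side, a StringResource can be as small as this (a sketch; the class name is illustrative):

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StringResource implements Resource {

    private final String name;
    private final String content;

    public StringResource(String name, String content) {
        this.name = name;
        this.content = content;
    }

    public String getName() {
        return name;
    }

    public InputStream getStream() {
        // Serve the in-memory string as a stream; no file or classpath involved
        return new ByteArrayInputStream(content.getBytes(StandardCharsets.UTF_8));
    }
}

A test can then do new MyService(new StringResource("basic", "key=value\n")) without touching the real my-settings.properties.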
Note, if you use Spring you already have an implementation of this concept. More details here
You can change your Service class to accept the name of the resource file, then use that name to load the resource.
public class MyService {

    private final Properties properties = new Properties();

    public MyService(String resourceFileName) throws IOException {
        // Load the named classpath resource into the Properties instance
        properties.load(getClass().getResourceAsStream(resourceFileName));
    }
}
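Each test can then point the service at a different fixture; a sketch (the file names are hypothetical):

@Test
public void emptySettings() throws Exception {
    MyService service = new MyService("/my-settings-empty.properties");
    // assert the service's behaviour for an empty properties file
}

@Test
public void basicSettings() throws Exception {
    MyService service = new MyService("/my-settings-basic.properties");
    // assert the service's behaviour for the basic properties file
}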
I have a util class with one method:
public static void setStyleForWidgetLayout(HTMLPanel panel, int rowQuantity) {
which, for each Widget in the HTMLPanel, assigns a width depending on the number of widgets and rows.
It is a very simple switch.
I want to test this method, but when I create a test with a normal JUnit test case I receive an error on creating the HTMLPanel:
Caused by: java.lang.UnsupportedOperationException: ERROR: GWT.create() is only usable in client code! It cannot be called, for example, from server code. If you are running a unit test, check that your test case extends GWTTestCase and that GWT.create() is not called from within an initializer or constructor.
Next I tried to extend GWTTestCase, but it requires implementing getModuleName(), and I don't have any particular module because the class I test is just a util class.
OK, I thought, let's mock it. I used Mockito, but the same error as in the previous code section occurred.
Then I found the class GwtTestWithMockito and tried to run with that, and again I received an error:
com.googlecode.gwt.test.exceptions.GwtTestConfigurationException: No declared module. Did you forget to add your own META-INF/gwt-test-utils.properties file with a 'gwt-module' property in the test classpath?
I added @GWTModule and the META-INF/gwt-test-utils.properties file, and experimented with different configurations. I tried existing module names and non-existing ones, but I still receive the error above.
Thanks in advance
I'm afraid you have to use GWTTestCase. Your module name is the name of your module XML file.
Sometimes you have to add your Panel to the root panel; I generally do this just to be on the safe side.
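A minimal skeleton could look like this (the module name is a placeholder for your own .gwt.xml; the assertions are left open):

import com.google.gwt.junit.client.GWTTestCase;
import com.google.gwt.user.client.ui.HTMLPanel;
import com.google.gwt.user.client.ui.RootPanel;

public class LayoutUtilTest extends GWTTestCase {

    @Override
    public String getModuleName() {
        // Fully qualified name of the module XML, without the .gwt.xml suffix
        return "com.example.MyModule";
    }

    public void testSetStyleForWidgetLayout() {
        HTMLPanel panel = new HTMLPanel("");
        RootPanel.get().add(panel); // attach to the DOM, to be on the safe side
        // call setStyleForWidgetLayout(panel, ...) and assert on widget widths
    }
}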
I have searched around but have been unsuccessful in determining whether the following approach is possible / good practice. Basically what I would like to do is the following:
Create JUnit tests which use DBUnit to initialize the data.
Run multiple test methods, each of which runs off of the same initial dataset.
Roll back after each test method to the state right after the initial setUp function.
After all test methods have run, roll back any changes that were made in the setUp function.
At this point the data in the database should be exactly the same as it was prior to running the JUnit test class.
Ideally I would not have to reinitialize the data before each test case because I could just roll back to the state right after setUp.
I have been able to roll back individual test methods but have been unable to roll back changes made in setUp after all test methods have run.
NOTE: I am aware of the different functionality of DBUnit such as CLEAN_INSERT, DELETE etc. I am using the Spring framework to inject my dataSource.
An example layout would look like:
public class TestClass {

    @Before
    public void setUp() {
        // Calls a method in a different class which uses DBUnit to initialize the database
    }

    @Test
    public void runTest1() {
        // Runs a test which may insert / delete data in the database.
        // After running the test the database is in the same state as it was
        // after running setUp.
    }

    @Test
    public void runTest2() {
        // Runs a test which may insert / delete data in the database.
        // After running the test the database is in the same state as it was
        // after running setUp.
    }

    // After runTest1 and runTest2 have finished, the database will be rolled back
    // to the state before any of the methods above had run.
    // The data will be unchanged, as if this class had never even been run.
}
I would be running the tests in a development database however I would prefer to not affect any data currently in the database. I am fine with running CLEAN_INSERT at the start to initialize the data however after all test methods have run I would like the data back to how it was before I ran my JUnit test.
Thanks in advance
Just as "setUp" JUnit offers a "tearDown" method executed after each test method. that you could use to rollback. Also starting with JUnit 4 you have the following annotations:
#BeforeClass: run once before running any of your tests in the test case
#Before: run every time before a test method
#After: run every time after a test method
#AfterClass: run once after all your tests in the current suite have been executed
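As a rough sketch of that pattern (assuming a Spring-injected DataSource and deliberately simplified connection handling; the DBUnit initialization would have to run against this same connection):

import java.sql.Connection;
import javax.sql.DataSource;
import org.junit.After;
import org.junit.Before;

public class TestClass {

    private DataSource dataSource; // injected by Spring
    private Connection connection;

    @Before
    public void setUp() throws Exception {
        connection = dataSource.getConnection();
        connection.setAutoCommit(false); // keep setUp + test in one transaction
        // DBUnit initialization against this connection goes here
    }

    @After
    public void tearDown() throws Exception {
        connection.rollback(); // undo everything the test (and setUp) changed
        connection.close();
    }
}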
We solved a similar problem at the oVirt open source project. Please take a look at the code residing at engine\backend\manager\modules\dal\src\test\java\org\ovirt\engine\core\dao\BaseDAOTestCase.java.
In general, look at what we did there in the @BeforeClass and @AfterClass methods. You can use it on a per-method basis. We used the Spring-test framework for that.
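For reference, the Spring-test approach boils down to something like this (a sketch; the runner and rollback-by-default behaviour are standard Spring-test, but the class and config names are placeholders):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // placeholder config
@Transactional // each test method runs in a transaction that is rolled back
public class MyDaoTest {

    @Test
    public void insertsAreRolledBackAfterEachTest() {
        // insert/delete freely; Spring rolls the transaction back afterwards
    }
}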
I am using Spring's declarative transactions (the @Transactional annotation) in "aspectj" mode. It works in most cases exactly like it should, but for one class it doesn't. We can call it Lang (because that's what it's actually called).
I have been able to pinpoint the problem to the load time weaver. By turning on debug and verbose logging in aop.xml, it lists all classes being woven. The problematic class Lang is indeed not mentioned in the logs at all.
Then I put a breakpoint at the top of Lang, causing Eclipse to suspend the thread when the Lang class is loaded. This breakpoint is hit while the LTW is weaving other classes! So I am guessing it either tries to weave Lang, fails, and doesn't output that, or some other class has a reference that forces it to load Lang before it actually gets a chance to weave it.
I am unsure however how to continue to debug this, since I am not able to reproduce it in smaller scale. Any suggestions on how to go on?
Update: Other clues are also welcome. For example, how does the LTW actually work? There appears to be a lot of magic happening. Are there any options to get even more debug output from the LTW? I currently have:
<weaver options="-XnoInline -Xreweavable -verbose -debug -showWeaveInfo">
I forgot to mention it before: spring-agent is being used to allow LTW, i.e., the InstrumentationLoadTimeWeaver.
Based on the suggestions of Andy Clement I decided to inspect whether the AspectJ transformer is ever even passed the class. I put a breakpoint in ClassPreProcessorAgent.transform(..), and it seems that the Lang class never even reaches that method, despite it being loaded by the same class loader as other classes (an instance of Jetty's WebAppClassLoader).
I then went on to put a breakpoint in InstrumentationLoadTimeWeaver$FilteringClassFileTransformer.transform(..). Not even that one is hit for Lang. And I believe that method should be invoked for all loaded classes, regardless of what class loader they are using. This is starting to look like:
A problem with my debugging. Possibly Lang is not loaded at the time when Eclipse reports it is
Java bug? Far-fetched, but I suppose it does happen.
Next clue: I turned on -verbose:class and it appears as if Lang is being loaded prematurely - probably before the transformer is added to Instrumentation. Oddly, my Eclipse breakpoint does not catch this loading.
This means that Spring is the new suspect. There appears to be some processing in ConfigurationClassPostProcessor that loads classes to inspect them. This could be related to my problem.
These lines in ConfigurationClassBeanDefinitionReader cause the Lang class to be read:
else if (metadata.isAnnotated(Component.class.getName()) ||
        metadata.hasAnnotatedMethods(Bean.class.getName())) {
    beanDef.setAttribute(CONFIGURATION_CLASS_ATTRIBUTE, CONFIGURATION_CLASS_LITE);
    return true;
}
In particular, metadata.hasAnnotatedMethods() calls getDeclaredMethods() on the class, which loads all parameter classes of all methods in that class. I am guessing that this might not be the end of the problem though, because I think the classes are supposed to be unloaded. Could the JVM be caching the class instance for unknowable reasons?
OK, I have solved the problem. Essentially, it is a Spring problem in conjunction with some custom extensions. If anyone comes across something similar, I will try to explain step by step what is happening.
First of all, we have a custom BeanDefinitionParser in our project. This class had the following definition:
private static class ControllerBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {

    @Override
    protected Class<?> getBeanClass(Element element) {
        try {
            return Class.forName(element.getAttribute("class"));
        } catch (ClassNotFoundException e) {
            throw new RuntimeException("Class " + element.getAttribute("class") + " not found.", e);
        }
    }

    // code to parse XML omitted for brevity
}
Now, the problem occurs after all bean definitions have been read and BeanDefinitionRegistryPostProcessors begin to kick in. At this stage, a class called ConfigurationClassPostProcessor starts looking through all bean definitions to search for bean classes annotated with @Configuration or that have methods with @Bean.
In the process of reading annotations for a bean, it uses the AnnotationMetadata interface. For most regular beans, a subclass called AnnotationMetadataVisitor is used. However, when parsing the bean definitions, if you have overridden the getBeanClass() method to return a class instance, like we had, a StandardAnnotationMetadata instance is used instead. When StandardAnnotationMetadata.hasAnnotatedMethods(..) is invoked, it calls Class.getDeclaredMethods(), which in turn causes the class loader to load all classes used as parameters in that class. Classes loaded this way are not correctly unloaded, and thus never weaved, since this happens before the AspectJ transformer is registered.
Now, my problem was that I had a class like so:
public class Something {

    private Lang lang;

    public void setLang(Lang lang) {
        this.lang = lang;
    }
}
Then, I had a bean of class Something that was parsed using our custom ControllerBeanDefinitionParser. This triggered the wrong annotation detection procedure, which triggered unexpected class loading, which meant that AspectJ never got a chance to weave Lang.
The solution was to not override getBeanClass(..), but instead override getBeanClassName(..), which according to the documentation is preferable:
private static class ControllerBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {

    @Override
    protected String getBeanClassName(Element element) {
        return element.getAttribute("class");
    }

    // code to parse XML omitted for brevity
}
Lesson of the day: Do not override getBeanClass unless you really mean it. Actually, don't try to write your own BeanDefinitionParser unless you know what you're doing.
Fin.
If your class is not mentioned in the -verbose/-debug output, that suggests to me it is not being loaded by the loader you think it is. Can you be 100% sure that 'Lang' isn't on the classpath of a classloader higher in the hierarchy? Which classloader is loading Lang at the point in time when you trigger your breakpoint?
Also, you don't mention the AspectJ version - if you are on 1.6.7, that version had issues with LTW for anything but a trivial aop.xml. You should be on 1.6.8 or 1.6.9.
How does LTW actually work?
Put simply, an AspectJ weaver is created for each classloader that may want to weave code. AspectJ is asked if it wants to modify the bytes for a class before it is defined to the VM. AspectJ looks at any aop.xml files it can 'see' (as resources) through the classloader in question and uses them to configure itself. Once configured it weaves the aspects as specified, taking into account all include/exclude clauses.
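For illustration, a minimal aop.xml along those lines might look like this (a sketch: the package pattern is a placeholder, and the aspect shown is the one Spring ships for @Transactional in aspectj mode):

<aspectj>
    <weaver options="-verbose -debug -showWeaveInfo">
        <!-- only weave application classes; everything else is excluded -->
        <include within="com.example..*"/>
    </weaver>
    <aspects>
        <aspect name="org.springframework.transaction.aspectj.AnnotationTransactionAspect"/>
    </aspects>
</aspectj>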
Andy Clement
AspectJ Project Lead
Option 1) AspectJ is open source. Crack it open and see what is going on.
Option 2) Rename your class to Bang, see if it starts working.
I would not be surprised if there is hard coding to skip "lang" in there, though I can't say why.
Edit -
Seeing code like this in the source:
if (superclassnameIndex > 0) { // May be zero -> class is java.lang.Object
    superclassname = cpool.getConstantString(superclassnameIndex, Constants.CONSTANT_Class);
    superclassname = Utility.compactClassName(superclassname, false);
} else {
    superclassname = "java.lang.Object";
}
It looks like they are trying to skip weaving of java.lang stuff... I don't see anything for just "lang", but it may be there (or a bug).