How to test HQL without redeploying the whole project? - java

I have a problem that consumes a lot of time during development: I have to test some HQL queries that I put in my DAOs, but to do that I have to recompile the whole project in Eclipse and deploy it to Tomcat, which takes something like 40-60 seconds just to start again, and if something goes wrong... I have to redeploy all over again.
So, is there a way to test HQL without recompiling everything? I tried the Hibernate Tools plugin, but I don't see how to use it with annotations (the project uses annotations only and makes no use of hbm files...).
Thanks

Set a breakpoint where you need it, debug, and use the 'Display' window to run any HQL query at runtime.
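For example, with a breakpoint inside a DAO method where a Hibernate Session variable named session is in scope (the entity and parameter below are just placeholders), you could paste something like this into the Display view, select it and evaluate it:
session.createQuery("from Customer c where c.status = :status")
       .setParameter("status", "ACTIVE")
       .list()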
http://help.eclipse.org/indigo/index.jsp?topic=%2Forg.eclipse.jdt.doc.user%2Freference%2Fviews%2Fdisplay%2Fref-display_view.htm
Maybe that helps.

Assuming you are accessing your DAO through a service, I would do something like this:
public static void main(String[] args) {
    // bootstrap the Spring context outside the container and exercise the service directly
    AbstractApplicationContext factory = new ClassPathXmlApplicationContext("application-context.xml");
    YourService yourService = (YourService) factory.getBean("YourService");
    YourObject obj = new YourObject("data1", "data2");
    yourService.save(obj);
    YourObject foundObj = yourService.load(1); // or yourService.findObjectByLabel("label")
    System.out.println(foundObj);
}
Or write a JUnit test: http://www.springbyexample.org/examples/simple-spring-transactional-junit4-test-code-example.html
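A minimal sketch of such a transactional test, assuming Spring's test support is on the classpath (the service, object and context file names below are placeholders for whatever your project actually uses):
import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:application-context.xml")
@Transactional // each test runs in a transaction that is rolled back afterwards
public class YourDaoIntegrationTest {

    @Autowired
    private YourService yourService; // hypothetical service wrapping the DAO

    @Test
    public void savesAndReloadsObject() {
        YourObject obj = new YourObject("data1", "data2");
        yourService.save(obj);
        assertNotNull(yourService.findObjectByLabel("data1")); // exercises the HQL in the DAO
    }
}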

For an occasional, one-off need I would go for the debug option mentioned by Vaelyr.
If you require stronger assertions, I would rather write some tests for the DAO, as tshenolo proposed.
But if you have some time and want a nice toy to play with, I would create a console page that lets you interact with your application.
For that I'd use a Groovy (or another scripting language) interpreter. If you expose the DAO or any other relevant objects to the interpreter context, you'll have a console for all kinds of experimentation without recompiling anything.
You'll be able to run arbitrary code within your app!
For an example with Groovy you can have a look here: Embedding Groovy, and more precisely here: Embedding a Groovy Console in a Java Server Application.
If you don't want to use Groovy, you can also have fun with BeanShell (pure Java), Rhino (JavaScript), or any other scripting language supported by the JVM.
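As a rough sketch of the idea in Java (assuming the Groovy jar is on the classpath; dao and sessionFactory stand for whatever objects you want to expose):
import groovy.lang.Binding;
import groovy.lang.GroovyShell;

// expose the interesting objects to the script context
Binding binding = new Binding();
binding.setVariable("dao", dao);                        // e.g. your DAO bean
binding.setVariable("sessionFactory", sessionFactory);  // or the Hibernate SessionFactory

GroovyShell shell = new GroovyShell(binding);
// the script text would come from the console page's input field
Object result = shell.evaluate("dao.findByLabel('label')");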
Beware that having this kind of console is a backdoor to your app and that you should not release it as a part of your application.

What I've done in the past to test HQL is write a limited set of integration tests using an in-memory db like Hypersonic and the Spring JUnit test extensions. This post describes how this can be done using DbUnit. You can also just brute-force the data using batch JDBC operations in your setup and tear down.
Cautionary notes: I would not add these tests to your suite of unit tests, as the data setup and teardown can take more time than a typical unit test. These are really integration tests intended only to aid you in developing and debugging your HQL. I wouldn't bother testing CRUD operations with these kinds of tests, as then you're just integration testing your ORM framework, which should have already been done.
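For the brute-force variant, the setup and tear down can be plain batch JDBC; a sketch using Spring's JdbcTemplate (table and column names are made up):
import org.junit.After;
import org.junit.Before;
import org.springframework.jdbc.core.JdbcTemplate;

public abstract class HqlIntegrationTestBase {

    protected JdbcTemplate jdbcTemplate; // assumed to be wired to the in-memory Hypersonic datasource

    @Before
    public void insertTestData() {
        jdbcTemplate.batchUpdate(
            "insert into CUSTOMER (ID, NAME, STATUS) values (1, 'Alice', 'ACTIVE')",
            "insert into CUSTOMER (ID, NAME, STATUS) values (2, 'Bob', 'INACTIVE')");
    }

    @After
    public void wipeTestData() {
        jdbcTemplate.batchUpdate("delete from CUSTOMER");
    }
}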

Related

Should we mock in Cucumber testing while testing Java code? To what extent should we use Cucumber?

I am a Java developer. We want to use Cucumber testing in our project. We are working mainly on creating APIs. I am comfortable with unit testing and am researching Cucumber.
As a starter, I am thinking about testing persistence methods - CRUD operations. My question is: what could the scenarios in this kind of testing be?
Also, should I mock the database by creating tables in the feature file? Should I use Mockito with Cucumber to mock calls to other services that connect to the database and server?
What should Cucumber testing look like in these scenarios, and what is the best way to create a framework for using Cucumber in our Java API project?
Also, how do we populate models if we're not using a database?
IMO Gherkin (the language you write Cucumber features in) is good for writing business-readable, simple scenarios.
To answer quickly, I would say that Cucumber is not a good fit for testing methods, if that is what you want to do.
As you can see from the file naming convention, you write *.feature files, and I think these files should only contain feature-related descriptions.
However, if you do have features to test, you have to choose how to test them:
- disconnected, so they can be run quickly by your CI:
  - you will have to mock everything that cannot be started up in the build lifecycle
  - there are solutions to start almost anything using Docker, like Testcontainers
- connected to an environment:
  - you do not have to mock anything
  - your tests may be slower
  - your tests may break because of the environment (failed deployment, server down, etc.)
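For the disconnected option, keep in mind that step definitions are plain Java classes, so you can use Mockito there just as you would in a unit test. A rough sketch, assuming a recent Cucumber-JVM (the service and entity below are invented):
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

public class CustomerApiSteps {

    // mocked so the scenario never touches a real database or remote server
    private final CustomerService customerService = mock(CustomerService.class);
    private Customer result;

    @Given("a customer named {string} exists")
    public void aCustomerNamedExists(String name) {
        when(customerService.findByName(name)).thenReturn(new Customer(name));
    }

    @When("the API is asked for customer {string}")
    public void theApiIsAskedForCustomer(String name) {
        result = customerService.findByName(name);
    }

    @Then("the customer {string} is returned")
    public void theCustomerIsReturned(String name) {
        assertEquals(name, result.getName());
    }
}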

Run code before Arquillian deployment

I am writing integration tests for a Java EE Servlet using Arquillian + JUnit. I need to be able to execute code before the server launches.
So is it possible to execute code before @Deployment? I tried @BeforeClass with no luck.
The reason I need to do this is that the trust and key stores for SSL need to exist before the server starts. I am creating the stores programmatically and saving them to files afterwards.
I know a possible workaround would be to have static trust and keystores, but I prefer to create them programmatically before the test starts for full flexibility when writing tests.
There is not really a need to have your own specialization of the Arquillian JUnit runner. That solution would only work for JUnit 4.x, which is what you are using for writing your tests.
Arquillian lets you hook into its runtime through its extension mechanism, and this way you can have custom logic executed before server startup to provide your keystores. I believe this is a more elegant and portable solution.
Please have a look at the sample extensions on GitHub (the lifecycle one in particular would be a good starting point). If you feel like implementing it this way, I'm more than happy to help you. The event you might want to observe is either BeforeSetup or BeforeStart.
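A skeleton of such an extension might look roughly like this (the keystore creation itself is omitted; the package names are from the container SPI and may differ slightly between Arquillian versions):
import org.jboss.arquillian.container.spi.event.container.BeforeSetup;
import org.jboss.arquillian.core.api.annotation.Observes;
import org.jboss.arquillian.core.spi.LoadableExtension;

// registered via a META-INF/services/org.jboss.arquillian.core.spi.LoadableExtension file
public class KeystoreExtension implements LoadableExtension {

    @Override
    public void register(ExtensionBuilder builder) {
        builder.observer(KeystoreCreator.class);
    }

    public static class KeystoreCreator {

        // fires before the container is set up, i.e. before the server starts
        public void createKeystores(@Observes BeforeSetup event) {
            // build the trust and key stores programmatically and write them
            // to the file locations the container configuration expects
        }
    }
}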
You have two other options for executing code before and after your test:
Rules or ClassRules, which are executed around and before/after the tests
Using a custom test runner (extending the default 'Arquillian' runner)
But as the static deployment method is not invoked by a rule, I assume you have to go for the test runner.

How do I skip a section of code when unit testing in Java

I am working on a Java web app with unit/integration tests. The app gets deployed to Jetty and uses an H2 db while running the integration test phase of Maven. I have one Oracle function which is called from the DAO layer and which cannot be migrated to the H2 db, hence I want to mock/skip this part of the code while running the test cases.
I thought of having a flag which tells whether I'm running the application in test mode and putting a condition on it in the code, but that doesn't look like a clean approach to me. Please suggest the best approach to achieve this.
Extract the native Oracle call into a separate class (probably some DAO). Inject that DAO into the class that uses it. Create a second implementation of that DAO that does nothing in place of calling Oracle. During integration testing, inject the latter implementation.
Avoid flags in your code. If you are using Spring, use build profiles that will selectively create one implementation or the other.
That's how dependency injection helps you test your code: if you want to mock some part of the system, just inject a mocked version.
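A sketch of that arrangement using Spring profiles (all names here are invented):
import java.math.BigDecimal;

import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Repository;

// the native Oracle call hidden behind an interface so it can be swapped out
public interface DiscountDao {
    BigDecimal computeDiscount(long customerId);
}

// real implementation, only instantiated when the "production" profile is active
@Repository
@Profile("production")
class OracleDiscountDao implements DiscountDao {
    public BigDecimal computeDiscount(long customerId) {
        // call the Oracle function here, e.g. through a CallableStatement
        return BigDecimal.ZERO; // placeholder for the real result
    }
}

// no-op stub injected while the Jetty/H2 integration tests run
@Repository
@Profile("integration-test")
class StubDiscountDao implements DiscountDao {
    public BigDecimal computeDiscount(long customerId) {
        return BigDecimal.ZERO;
    }
}
The active profile can then be selected per environment, for example with -Dspring.profiles.active=integration-test in the Maven integration-test configuration.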
Please use a good mocking framework such as Mockito, jMock, or another similar mocking framework.
Please note: you might be required to refactor your code to make it more testable.
If the question truly is:
How do I skip a section of code when unit testing in Java
then I agree with the answers given. Dependency injection, mocking frameworks are absolutely the right way to go to do true unit testing.
However if the question is:
How do I skip a section of code when using JUnit (or other unit testing framework)
Then I think the answer is "it depends". Sometimes I use JUnit for integration testing - snippets of client code that I run against a test server to save me the trouble of doing these client-side tests manually via a GUI. In this case I use system properties; for example, in my base class I have:
protected boolean skipTest()
{
    String port = System.getProperty("jersey.test.port");
    // don't run this test unless developer has explicitly set the testing properties
    // this is an integration test, not a unit test
    return port == null;
}
Then in the actual test class it looks like this:
// verify a successful login
@Test
public void testLogin()
{
    if (skipTest())
        return;
    // do real test
}
So, my thought is if you really cannot refactor the Oracle stuff out of your DAO, then you really are doing an integration test and it's OK to have a skipTest in your unit test.

How to unit test legacy J2EE application

This may sound like a vague question but I am looking for some specific pointers.
Our J2EE app is built on Struts2 + Plain Servlets + JSP + iBatis + Oracle
I would prefer to write unit tests in Scala so that I can learn the language on the side as well
What would I need to be able to verify that a specific column is displayed in the JSP after following some specific steps?
Click on a link, select some parameters and submit the page to the servlet.
Verify that the next page has a specific column inside its <table> tag.
What would I need to create mock requests for the servlet?
I am trying to write tests like the above in addition to core business functionality tests; however, the problem is that I am trying to wrap legacy code in unit tests, and the code of course is not designed for unit testing.
I wouldn't call this unit testing, as you are trying to test the integration of several units. It's also rather hard to create a unit test for a JSP because it has many context dependencies that are only available when you are in the container.
Instead I would advise writing some automated functional tests that are executed against the running (deployed) application.
Frameworks like Selenium may be of great help here, as they allow you to simulate real user behaviour and make assertions against the produced HTML code.
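For the table-column check you describe, a Selenium WebDriver test could look roughly like this (the URL, link text, field names and column header are placeholders). Since Selenium has a plain Java API, the same test could just as well be written in Scala, as you mentioned wanting to do.
import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class ColumnDisplayedTest {

    @Test
    public void columnAppearsAfterSubmittingParameters() {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/yourapp/start.action"); // the deployed app
            driver.findElement(By.linkText("Search")).click();        // click the link
            driver.findElement(By.name("someParameter")).sendKeys("some value");
            driver.findElement(By.name("submit")).click();            // submit to the servlet
            // assert that the resulting page's table contains the expected column
            String table = driver.findElement(By.tagName("table")).getText();
            assertTrue(table.contains("Expected Column"));
        } finally {
            driver.quit();
        }
    }
}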
EDIT: Another approach here may be to:
- start an embedded servlet container like Jetty within your test code
- deploy all your plain servlets and JSPs to it
- replace the Oracle database with an in-memory database like HSQL or Derby
- populate it with some test data using DBUnit
- and then again use either Selenium (which has a Java binding) or HttpUnit to make requests and assertions against the generated HTML code
But again it will not be a unit test, but rather an integration test.
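A rough outline of that setup in test code, assuming the Jetty and HSQLDB/Derby jars are on the test classpath (paths and the Jetty API details vary by version):
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;

public abstract class InContainerTestBase {

    private static Server server;

    @BeforeClass
    public static void startContainer() throws Exception {
        server = new Server(8080);
        // deploy the existing servlets and JSPs from the exploded webapp directory
        server.setHandler(new WebAppContext("src/main/webapp", "/yourapp"));
        server.start();
        // at this point you would also switch the datasource to in-memory HSQL/Derby
        // and load the test data with DBUnit
    }

    @AfterClass
    public static void stopContainer() throws Exception {
        server.stop();
    }
}
Concrete tests extending this base class can then drive http://localhost:8080/yourapp with Selenium or HttpUnit and assert against the generated HTML.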
Like everyone said, you're not really talking about unit testing; you're talking about functional testing. I'd think hard about what your real goals are. What is driving this push for automated testing? Does the application have configuration issues (i.e. it's hard to configure, so some parts work and others don't)? That might justify building a smoke test suite in Selenium targeting your pain pages and test cases. This will also help detect regression bugs.
As for the legacy concerns: no application is beyond help. If you are running front-end tests in Selenium, it doesn't matter how the code is written as long as it produces parseable HTML.
As for your actual server-side code, you just gotta roll Andy Dufresne style. As you fix bugs and add functionality, code with Test Driven Development principles in mind. Rework code that relates to your changes and add unit tests. You'd be surprised at how fast a legacy app can come around if you keep chipping away at it.

Data-driven tests with jUnit

What do you use for writing data-driven tests in jUnit?
(My definition of) a data-driven test is a test that reads data from some external source (file, database, ...), executes one test per line/file/whatever, and displays the results in a test runner as if you had separate tests - the result of each run is displayed separately, not in one huge aggregate.
In JUnit 4 you can use the Parameterized test runner to do data-driven tests.
It's not terribly well documented, but the basic idea is to create a static method (annotated with @Parameters) that returns a Collection of Object arrays. Each of these arrays is used as the arguments for the test class constructor, and then the usual test methods can be run using fields set in the constructor.
You can write code to read and parse an external text file in the @Parameters method (or get data from another external source), and then you'd be able to add new tests by editing this file without recompiling the tests.
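A minimal sketch of the runner (the data is inline here, but the @Parameters method could just as well parse it from a file):
import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class AdditionTest {

    @Parameters
    public static Collection<Object[]> data() {
        // each array becomes one separately reported test instance
        return Arrays.asList(new Object[][] {
            { 1, 1, 2 },
            { 2, 3, 5 },
            { -1, 1, 0 }
        });
    }

    private final int a;
    private final int b;
    private final int expected;

    public AdditionTest(int a, int b, int expected) {
        this.a = a;
        this.b = b;
        this.expected = expected;
    }

    @Test
    public void addsCorrectly() {
        assertEquals(expected, a + b);
    }
}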
This is where TestNG, with its @DataProvider, shines. That's one reason why I prefer it to JUnit. The others are dependencies and parallel threaded tests.
I use an in-memory database such as HSQLDB so that I can either pre-populate the database with a "production-style" set of data or start with an empty HSQLDB database and populate it with the rows that I need to perform my testing. On top of that I will write my tests using JUnit and Mockito.
I use a combination of DbUnit, jMock and JUnit 4. Then you can either run them as a suite or separately.
You are better off extending TestCase with a DataDrivenTestCase that suits your needs.
Here is working example:
http://mrlalonde.blogspot.ca/2012/08/data-driven-tests-with-junit.html
Unlike parameterized tests, it allows for nicely named test cases.
I'm with @DroidIn.net, that is exactly what I am doing. However, to answer your question literally ("and displays the results in a test runner as if you had separate tests"), you have to look at the JUnit 4 Parameterized runner. DBUnit doesn't do that. If you have to do a lot of this, honestly TestNG is more flexible, but you can absolutely get it done in JUnit.
You can also look at the JUnit Theories runner, but my recollection is that it isn't great for data-driven datasets, which kind of makes sense because JUnit isn't about working with large amounts of external data.
Even though this is quite an old topic, I still thought of contributing my share.
I feel JUnit's support for data-driven testing is too limited and too unfriendly. For example, in order to use Parameterized, we need to write our own constructor. With the Theories runner we do not have control over the set of test data that is passed to the test method.
There are more drawbacks as identified in this blog post series: http://www.kumaranuj.com/2012/08/junits-parameterized-runner-and-data.html
There is now a comprehensive solution coming along pretty nicely in the form of EasyTest, a framework extended out of JUnit that is meant to give a lot of functionality to its users. Its primary focus is data-driven testing with JUnit, although you are no longer required to actually depend on JUnit. Here is the GitHub project for reference: https://github.com/anujgandharv/easytest
If anyone is interested in contributing their thoughts/code/suggestions, now is the time. You can simply go to the GitHub repository and create issues.
Typically, data-driven tests use a small testable component to handle the data (a file-reading object, or mock objects). For databases and resources outside of the application, mocks are used to simulate other systems (web services, databases, etc.). Typically what I see is that there are external data files that hold the data and the expected output. This way the data files can be added to the VCS.
We currently have a props file with our ID numbers in it. This is horribly brittle, but it is easy to get something going. Our plan is to initially have these ID numbers overridable by -D properties in our Ant builds.
Our environment uses a legacy DB with horribly intertwined data that is not loadable before a run (e.g. by dbUnit). Eventually we would like to get to where a unit test would query the DB to find an ID with the property under test, then use that ID in the unit test. It would be slow and is more properly called integration testing, not "unit testing", but we would be testing against real data to avoid the situation where our app runs perfectly against test data but fails with real data.
Some tests will lend themselves to being interface-driven.
If the database/file reads are performed through an interface call, then simply have your unit test implement the interface, and the unit test class can return whatever data you want.
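A trivial sketch of that (the names are invented):
import java.util.Arrays;
import java.util.List;

// production code depends only on this interface for its data
public interface CustomerSource {
    List<Customer> loadCustomers();
}

// the test supplies canned data instead of hitting the database or file system
class FixedCustomerSource implements CustomerSource {
    public List<Customer> loadCustomers() {
        return Arrays.asList(new Customer("Alice"), new Customer("Bob"));
    }
}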
