It seems to me that I am fundamentally misunderstanding the purpose of Robolectric. I've been battling with it for a week already, and so far getting a new error message counts as making progress. I am able to test some basic stuff like static views in an activity, but as soon as anything more complicated comes into play, things fall apart. I had to extend Robolectric to support third-party libraries with certain parameters, AppCompat action bars and numerous other things, which was extremely time-consuming and wasn't really documented anywhere, and progress has been pretty much glacial. I am starting to think that I am using it the wrong way and it simply isn't supposed to do what I want it to do.
The general app logic is quite straightforward, so there isn't really much to unit test; the most complicated stuff is in the UI and remote API calls. Is Robolectric just supposed to make unit testing for Android less painful than plain JUnit, because it can run on the JVM and supports a few Android classes? Perhaps a black-box behaviour testing framework like Espresso would be more suitable for my needs? But we use continuous integration, and Robolectric was nice and easy to set up to run tests on the CI server, and I'd kind of like to keep it that way.
What do you use Robolectric for? A lot of blog posts recommend it for "activity lifecycle testing", but since I'm also quite new to the Android world, I don't really understand the purpose of that, especially since the app I'm testing is portrait-only. Could someone please give an overview of what you use Robolectric for and how you do it, preferably with code examples, and explain why and how those tests are important?
We use it for:
unit testing: all components from parsers and utils, to controllers and presenters
integration/acceptance testing: the business logic of the app, per screen (which falls into integration and/or acceptance testing)
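To make the unit-testing side of that concrete, here is a minimal sketch of the kind of Robolectric test meant; the LoginActivity, its view IDs and the button-enabling rule are invented for illustration, and the exact runner/configuration details depend on your Robolectric version:

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.robolectric.Robolectric;
    import org.robolectric.RobolectricTestRunner;

    import android.widget.Button;
    import android.widget.EditText;

    @RunWith(RobolectricTestRunner.class)
    public class LoginActivityTest {

        @Test
        public void signInIsDisabledUntilBothFieldsAreFilled() {
            // Drive the activity through onCreate/onResume on the JVM, no device needed.
            LoginActivity activity = Robolectric.buildActivity(LoginActivity.class)
                    .create().resume().get();

            EditText user = (EditText) activity.findViewById(R.id.username);
            EditText pass = (EditText) activity.findViewById(R.id.password);
            Button signIn = (Button) activity.findViewById(R.id.sign_in);

            assertFalse(signIn.isEnabled());

            user.setText("alice");
            pass.setText("secret");
            assertTrue(signIn.isEnabled());
        }
    }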
We don't use it for (and have found it difficult to use for these):
testing the network layer (we run all tests by injecting the test data in the same way the network layer would; parsers are tested separately)
user flows through different screens
If you're looking for more of the latter, perhaps Espresso/Robotium are better suited for your needs. And you absolutely can run these as part of your CI pipeline, but you'll need to invest some time in setup, or integrating with something like Appurify.
If you are finding it very difficult to write your tests, it might have more to do with the way your app is architected than with the way you're using Robolectric. See my answer here as well; it might help you: Writing Android acceptance tests with robolectric: how could it be done?
Related
I am planning to write some UI tests for a legacy Java EE application. Can anyone with similar experience recommend a tool for this? (I am thinking of going ahead with Selenium.) Also, do you recommend including these tests as part of the CD pipeline (my concern is that they are usually a little fragile)? (It would be great if you could share some strategies, e.g. running part of them in the CD pipeline and part as a separate daily regression.)
Thanks
Try this as a good pointer. Re-think your approach!
http://googletesting.blogspot.co.uk/2015/04/just-say-no-to-more-end-to-end-tests.html
Also read about the Test Pyramid.
http://martinfowler.com/bliki/TestPyramid.html
Some people will disagree... depends on the calibre of teams you're working with, and the levels of stress they're under to deliver!
The basic question is: "How should one start writing unit and integration tests for an untested project, especially considering that the person is not familiar with the code and has not done integration testing before?"
Consider the scenario where unit tests and integration tests have to be written for a project. The project uses Java/J2EE technology and does not have any tests written at all.
The dilemma I face is that, since I have not written the code, I don't want to refactor it immediately just to write tests. I also have to select a testing framework; I am thinking of using Mockito and PowerMock.
I also have to estimate code coverage for the tests, and then perform integration testing. I will have to research integration testing tools and select one. I have not done any integration testing, or estimated an acceptable level of code coverage for a project, before.
Since I am working independently, I would appreciate any strategies, tips, or suggestions on what I should start with, and any tools you can recommend.
First things first:
Understand the architecture: what are the main components?
If you have no good overview of the features and functions the program offers, make a list of them and create a hierarchy of them
Get familiar with the code; I recommend the following approach:
after you understand where the code of the different components starts, try to figure out the method invocation hierarchy (in Eclipse you can easily jump to source code definitions by pressing F3)
later you can do the same while debugging the code; this way the IDE jumps automatically to the definitions, plus you can observe how the state of the program changes
For unit testing itself, I can recommend Clean Code, Chapter 9 (circa 12 pages) for starters. It uses JUnit for the examples and gives a very good introduction to how good testing is done.
There you will learn things like the F.I.R.S.T. principle, that Unit Tests should be:
Fast, Independent, Repeatable, Self-Validating and Timely
Some clarifications: JUnit is the most widely used and accepted test framework itself. Mockito and PowerMock are mocking frameworks; they are used together with JUnit when you need to stand in for the real dependencies of the code under test.
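For example, here is a minimal sketch of Mockito plugged into a plain JUnit test; OrderService and PaymentGateway are invented names, used only to show the division of labour between the two frameworks:

    import static org.junit.Assert.assertTrue;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    public class OrderServiceTest {

        @Test
        public void placingAnOrderChargesTheGateway() {
            // The collaborator is mocked, so no real payment system is touched.
            PaymentGateway gateway = mock(PaymentGateway.class);
            when(gateway.charge("customer-1", 100)).thenReturn(true);

            OrderService service = new OrderService(gateway);

            assertTrue(service.placeOrder("customer-1", 100));
            verify(gateway).charge("customer-1", 100);
        }
    }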
For code coverage I can only recommend Cobertura, but there are many more.
Start with unit tests before you dive into integration tests (bottom-up). You can also do it the other way around (top-down), but since you say you are not very experienced, I would stick to the first.
Finally, just go for it and get started. You will learn the most and fastest while actually writing the test code.
Stop. "..not familiar with the code..". First get familiar with the code and most importantly its expected functionality. You can't refactor or unit test a code that you are not comfortable with.
Since you have not written unit tests before, I would suggest learning about them and getting comfortable with them first.
Important: bad/wrong unit tests are worse than no unit tests, because the next person who maintains your code will misinterpret the functionality.
There are a bunch of code coverage tools out there. Use whichever suits you best.
Adding tests to legacy code that has no tests is a difficult task. As @Suraj has mentioned, get familiar with the code base and the expected functionality. You can't test it if you don't know what it is supposed to do.
In terms of choosing which areas of the code to test, start with the high-business-value areas. Which functionality is most important? You want to make sure you have a strong test set for that code.
Since you don't have any unit/integration tests, I would start with some high-level end-to-end tests that at least ensure that, given some inputs to the system, you get the expected outputs. This doesn't ensure correctness, but it at least ensures consistency.
Then, as you develop a test suite, you can be confident that the refactorings you are doing are not changing the behavior of the code (unless, of course, you find bugs that are then fixed).
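A rough sketch of what such a high-level consistency test could look like, assuming a hypothetical ReportGenerator entry point and test resource files; the expected output is captured from the system's current behaviour, not from a spec:

    import static org.junit.Assert.assertEquals;

    import java.nio.file.Files;
    import java.nio.file.Paths;

    import org.junit.Test;

    public class ReportGeneratorCharacterizationTest {

        @Test
        public void knownInputStillProducesTheSameReport() throws Exception {
            // Pin down today's behaviour: consistency, not proven correctness.
            byte[] input = Files.readAllBytes(Paths.get("src/test/resources/sample-order.xml"));
            byte[] expected = Files.readAllBytes(Paths.get("src/test/resources/expected-report.txt"));

            String report = new ReportGenerator().generate(new String(input, "UTF-8"));

            assertEquals(new String(expected, "UTF-8"), report);
        }
    }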
For testing frameworks, JUnit is the standard unit testing framework. Note that the frameworks Mockito and Powermock are not testing frameworks themselves, but they can be used within JUnit.
For acceptance tests, there are also a variety of frameworks to help. For web UI testing, Selenium is pretty standard. There are also tools like Fitnesse for more table driven testing.
There are also some common frameworks to help with code coverage - Cobertura, Emma, Clover come to mind.
I would also set up an automated build (a Jenkins build server is pretty simple to set up). This will allow you to run your tests on every check-in. Even though your code coverage is going to be low to start with, getting into this habit early is worthwhile.
I have been using the guard framework in Rails for quite a while and have just fallen in love with it. I also work in Java a lot, and I find it surprising that such a tool is not available for rapid test-driven development. Here's what guard does for you (from the original website):
File system changes handled by our awesome Listen gem.
Support for visual system notifications.
Huge (more than 120) guard extensions eco-system.
Tested against Ruby 1.8.7, 1.9.2, 1.9.3, REE and the latest versions of JRuby & Rubinius.
In essence, it helps me keep an eye on test cases while I am making changes or adding stuff to my codebase. The following are the benefits of such an approach:
Unobtrusive test case driven development
Instant acknowledgement of the impact of a code change
High quality code
Minimized regression ripples
Does anyone know how to achieve similar goals in Java?
Note: automated test tools like Hudson are not what I am looking for. I need something that can be used on local development machines/environments, so that there is instant test-case feedback on a code change.
Thanks
Not sure what you are looking for ... but if you are searching for a tool that runs your unit tests in the background during development, have a look at Infinitest or JUnit Max.
If you work with Eclipse and you dock your JUnit view in a convenient place, it's one button click to rerun your tests and see the report immediately in the same view.
It's a very efficient way to perform test-driven development.
You could use Sonar with Hudson to do this for you. In the company I work for, we use this combination to solve your issue. There's also a Sonar plugin for Eclipse. But if you are looking for a purely IDE-based solution, I can't help you.
I'm working on an existing Java EE project with various maven modules that are developed in Eclipse, bundled together and deployed on JBoss using Java 1.6. I have the opportunity to prepare any framework and document how unit testing should be brought to the project.
Can you offer any advice on...
JUnit is where I expect to start; is this still the de facto choice for the Java dev?
Any mocking frameworks worth setting as standard? JMock?
Any rules that should be set - code coverage, or making sure it's unit rather than integration tests.
Any tools to generate fancy looking outputs for Project Managers to fawn over?
Anything else? Thanks in advance.
Any tools to generate fancy looking outputs for Project Managers to fawn over?
Be careful. A fancy tool for displaying metrics on unit test counts, coverage, code quality metrics, line counts, check-in counts and so on can be dangerous in the hands of some project managers. A project manager (who is not in touch with the realities of software development) can get obsessed with the metrics, and fail to realize that:
they don't give the real picture of the project's health and progress, and
they can give a completely false picture of the performance of individual team members.
You can get silly situations where a manager gives the developers the message that they should (for example) try to achieve maximal unit test coverage for code where this is simply not warranted. Time is spent on pointless work, the important work doesn't get done, and deadlines are missed.
Any rules that should be set - code coverage, or making sure it's unit rather than integration tests.
Code coverage is more important for parts of the code that are likely to be fragile / buggy. Acceptable coverage levels should reflect this.
Unit tests versus integration tests depends on the nature and complexity of the system you are building.
Adding lots of unit-level tests after the fact is probably a waste of time. It should only be done for classes identified as being problematic / needing maintenance work.
Adding integration-level tests after the fact is useful, especially if the project's original developers are no longer around. A decent integration test suite helps to increase your confidence that some change does not break important system functionality. But this needs to be done judiciously. A test suite that tests the N-th degree of a website's look and feel can be a nightmare to maintain ... and an impediment to progress.
Concerning the unit testing framework, there are mainly two of them: JUnit and TestNG. Both have their advantages, and both perform equally well. The main advantage of JUnit is (to my mind) its default incorporation of an Eclipse plugin, allowing tests to be invoked easily.
Concerning the mocking frameworks, I don't find them to be a required part of your testing approach. Of course they're useful, but they serve a specific purpose: testing behaviour (as opposed to testing an interface, which is what JUnit allows). With mocking frameworks, you're able to test how a specific class implements a specific interface. Will you need it? Obviously. Will you need it first? I don't know.
Concerning the rules, the only one I've found to be useful is simple (as always): "always test code that broke at least once". Consider your bug tracker: each time a bug is encountered, there must be a unit test ensuring there is no regression. It's, to my mind, the fastest way to get quality code.
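For instance, a small sketch of such a regression test; the issue number and the PriceCalculator are purely illustrative:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class PriceCalculatorRegressionTest {

        // Reproduces tracker issue #1234 (illustrative number): totals of exactly
        // 100.00 were wrongly given the bulk discount before the fix.
        @Test
        public void issue1234_boundaryAmountDoesNotGetBulkDiscount() {
            assertEquals(100.00, new PriceCalculator().total(100.00), 0.001);
        }
    }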
Concerning the fancy (and efficient) output, I can't recommend enough installing a continuous integration server (Hudson, obviously). It will run your whole test suite each time code is committed, to ensure there are no side effects. It will generate graphs showing the number of tests run, and so on. It can also integrate code coverage tools and graphs. This continuous integration server will quickly become your testing buddy.
This is a complex question, so just a few notes about our practice at $work:
JUnit is indeed still the standard. Most documentation and literature treats JUnit.
Mockito seems to be the new star in Java mocking, although we still use JMock and think it's fine for our needs.
We use the EclEmma Eclipse plugin for checking our test coverage, and like it.
If you haven't done so already, read Working Effectively with Legacy Code by Michael Feathers.
I've been retrofitting unit tests to a C++ project and it is not pleasant.
The first thing I did was to identify where most of the 'action' occurs, then use that to start putting unit tests on the functions that can be tested easily.
Then, once you have the easier ones, you can start looking at expanding the coverage virally: attack the functions that have fewer dependencies, run through them a few times in a debugger to see what values are passed in, and then write unit tests with those values to make sure you don't break anything.
Don't expect a quick fix: it's taken 3 weeks (6-hour days, 5 days a week) to get 20% coverage, but the program spends 80% of its time in that code, so I think it has been time well spent, and it has uncovered quite a few bugs.
Regarding test coverage, I think that when you're bringing in unit testing to an existing project it's too early to start setting coverage expectations. You should start by ensuring that you actually can integrate the test framework and get reports from the coverage tools. Once you've done that you can start monitoring coverage, and then you can consider targets.
I'm currently developing two Java networking applications for school projects, one over TCP and the other over UDP. In both I have to implement a simple custom protocol.
Even though I'm trying pretty hard, I can't find a way to correctly test this kind of app, or better, to build it with test-first development.
If I have a client and I want a real test without stubbing everything out, I have to implement a server with simulated behaviour, which in the case of simple apps like these is almost the whole project. I understand that for something big, writing a few lines of Perl script to test it could really help.
Right now I'm developing the server and client simultaneously, so that I can at least test by hand, but this doesn't seem like a clean way to develop. The only thing that is helping is tunneling the connection through a logger, so that I can see all the data that goes through (using the TunneliJ plugin for IDEA).
What is the best way to TDD a networking application with custom protocol? Should I just stub everything and be fine with it?
Separate the protocol from the network layer. Testing the logic of the protocol will become easier once you can feed it your own data, without the need to go through the network stack. Even though you are not using Python, I'd suggest looking at the structure of the Twisted framework. It's a nice example of how to unit test networking applications.
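As a sketch of what that separation can look like in Java, a pure encode/decode class can be exercised with plain strings and no sockets at all; ProtocolCodec, Message and the "MSG <sender> <body>" format are invented for illustration:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class ProtocolCodecTest {

        @Test
        public void decodesAWellFormedMessage() {
            Message msg = ProtocolCodec.decode("MSG alice hello");
            assertEquals("alice", msg.getSender());
            assertEquals("hello", msg.getBody());
        }

        @Test
        public void encodeThenDecodeIsLossless() {
            Message original = new Message("bob", "ping");
            assertEquals(original, ProtocolCodec.decode(ProtocolCodec.encode(original)));
        }
    }

The networking code then only shuffles bytes between the socket and the codec, and stays thin enough that there is little left to test at that layer.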
We wound up with the same problem a while ago. We decided it was simpler to put two developers on the task: one to write the server and one to write the client. We started working in the same office so that we could code, test, modify, repeat a little bit more easily.
All in all, I think it was the best solution for us. It gave us the ability to actually test the program in conditions that were not ideal. For instance, our Internet went out a couple of times and our program crashed, so we fixed it. It worked rather well for us, but if you are a sole developer, it may not be the solution for you.
Whatever you do, when writing a custom protocol, I would check out Wireshark for monitoring your network traffic to make sure all of the packets are correct.
In my app I have code such as this
m_socket.receive(packet);
doSomething(packet);
I mock up the receive and hence can exercise everything that doSomething() needs to do.
Where does this break down for you? Here you are truly unit testing that your code behaves correctly; you can also mock the socket send, and set expectations for what you think should be sent according to your protocol.
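To make that concrete, here is a hedged sketch using Mockito, with the socket hidden behind a small hypothetical PacketSource seam so the receive can be stubbed and the send verified; ProtocolHandler and the LOGIN/OK strings are made up for illustration:

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    // A thin seam around the real socket so tests never touch the network.
    interface PacketSource {
        String receive();
        void send(String line);
    }

    public class ProtocolHandlerTest {

        @Test
        public void loginPacketGetsAnOkReply() {
            PacketSource socket = mock(PacketSource.class);
            when(socket.receive()).thenReturn("LOGIN alice");

            ProtocolHandler handler = new ProtocolHandler(socket);
            handler.handleNext();

            assertEquals("alice", handler.lastLoggedInUser());
            verify(socket).send("OK alice");
        }
    }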
We are of course not actually testing that the other end of the protocol is happy. That's integration testing. I always hanker after getting to IT as soon as possible. It's when you interact with the "other end" that you find the interesting stuff.
You are in the lucky position of being in control of both ends; in that position I would probably spend some time building suitable, controllable test harnesses.