I'm trying to find the best way to create automated functional/acceptance/regression tests for some Java applications. All the applications work in this way:
They read a file from a given folder.
They write a new file in another format with the content of the input file.
They send some of the information from the processed files to a database.
They wait until a new file is left in the input folder.
It is a cyclic application; it never stops.
New files/formats are added continuously, and several of our libraries are shared by all the formats. Manual testing costs more and more with each new format. All the files are plain text, but each format stores its data in a different way.
We need a way/tool that could help us automate the functional/acceptance/regression tests (especially the QA tests).
The question is: What tool/way of testing can be used for this?
I was thinking of something that can drop files in the input folder and compare what the application creates in the output folder with an expected file. I don't know if this can be done easily with an existing tool or if we have to build all of this ourselves.
I would use a generic functional test automation framework together with a set of libraries to read/parse/compare files. I am familiar with Robot Framework, and there are some Python libraries to read/compare files (some built into Robot itself, some elsewhere). That is very convenient and quite easy to use for QA tests. Check out the demo project for a good start.
If you prefer to stay in the Java ecosystem, you might want to try Cucumber-jvm or JBehave.
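Whichever framework you choose, the heart of each test can stay plain Java: drop a known input file into the watched folder, wait for the application to write its output, and compare it with an expected file. A minimal JUnit 5 sketch of that idea (the folder paths, file names, and timeout below are assumptions, not part of any particular tool):

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.concurrent.TimeUnit;

import org.junit.jupiter.api.Test;

class FormatConversionIT {

    // Hypothetical folders watched/written by the application under test.
    private static final Path INPUT_DIR = Paths.get("/data/app/input");
    private static final Path OUTPUT_DIR = Paths.get("/data/app/output");

    @Test
    void convertsFormatAAsExpected() throws Exception {
        // Drop a known input file into the folder the application polls.
        Path input = Paths.get("src/test/resources/formatA/sample.txt");
        Files.copy(input, INPUT_DIR.resolve("sample.txt"), StandardCopyOption.REPLACE_EXISTING);

        // Wait (with a timeout) for the application to produce its result.
        Path actual = OUTPUT_DIR.resolve("sample.out");
        long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(30);
        while (!Files.exists(actual) && System.nanoTime() < deadline) {
            Thread.sleep(500);
        }

        // Compare the produced file with the expected one, line by line.
        Path expected = Paths.get("src/test/resources/formatA/expected.out");
        assertEquals(Files.readAllLines(expected), Files.readAllLines(actual));
    }
}

The same drop/wait/compare steps map directly onto Robot Framework keywords or Cucumber step definitions if you go that route.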
I'm trying to set up some initial tests for a JavaFX application, but specifically, I want to try and test the TreeView component, and I'm wondering how best to isolate access to the file system so that it can be tested (without having a dependency on my own file system). This is the project that I'm trying to test: https://github.com/mfearby/magnificat (it's a very basic thing at the moment and all it does is show a TabPane with some TreeViews pointing to the file system).
When you change the selected tab, it writes to an INI file so that when the app loads, it can restore those tabs. I'd like to be able to test that this works, but it would depend on my own file system structure. The only testing I've ever done is with RedGate SQL Test, and I've never tested a GUI app, so please forgive the newbie question.
Should I somehow alter my FileTreeItem class which uses java.io.File.listFiles() so that it uses a virtual file system when testing (perhaps with Jimfs)? If I do this, how do I modify my class so that it uses Jimfs when testing but java.io.File when not in testing? Should I set a global variable (eek!) and act accordingly within my FileTreeItem class? That sounds awful but I'm scratching my head just wondering how best to test file system access without creating an absolute abomination.
Is it even right to modify the application code in such a way as to accommodate a testing tool (apart from good coding practices in general), or should it be up to the test framework to do all the fakery needed to get the app to function in a testable way?
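To make the Jimfs idea concrete, the kind of seam I'm imagining is a FileTreeItem that takes a java.nio.file.Path instead of calling java.io.File directly, so a test can hand it an in-memory path. A rough sketch of that (the constructor and method names here are hypothetical, not my current code):

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.FileSystem;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

import com.google.common.jimfs.Configuration;
import com.google.common.jimfs.Jimfs;

// Hypothetical reworking of FileTreeItem: it is handed a Path instead of using java.io.File itself.
class FileTreeItem {

    private final Path dir;

    FileTreeItem(Path dir) {
        this.dir = dir;
    }

    // Replacement for File.listFiles(): works against whatever FileSystem the Path came from.
    List<Path> children() throws IOException {
        List<Path> result = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            stream.forEach(result::add);
        }
        return result;
    }
}

class FileTreeItemTest {

    void listsChildrenFromInMemoryFileSystem() throws IOException {
        // Build an in-memory file system so the test never depends on my real disk layout.
        FileSystem fs = Jimfs.newFileSystem(Configuration.unix());
        Path music = fs.getPath("/music");
        Files.createDirectories(music.resolve("album1"));

        assert new FileTreeItem(music).children().size() == 1;
    }
}

In production I'd just construct it with a Paths.get(...) from the default file system.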
Any advice or pointers to "the right way" to do this would be much appreciated.
Thanks.
Ask yourself what it is you're unit testing.
Are you actually unit testing the writing to a file? Why? You shouldn't test the JVM; trust that it has already been tested.
So if not that, then what are you testing?
The file format? I.e. can what you write be read and understood? That part can be unit tested using ByteArrayOutputStream and ByteArrayInputStream instead, so refactor the code so that the writing/reading logic works against byte streams when unit testing and file streams when running for real.
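A minimal sketch of that refactor, using a hypothetical TabStateStore that persists the open tabs as key/value text: the read/write logic only sees streams, so the unit test never touches the disk.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

// Hypothetical class: persists the selected tab paths, independent of where the bytes go.
class TabStateStore {

    void save(Properties tabs, OutputStream out) throws IOException {
        tabs.store(new OutputStreamWriter(out, StandardCharsets.UTF_8), "open tabs");
    }

    Properties load(InputStream in) throws IOException {
        Properties tabs = new Properties();
        tabs.load(new InputStreamReader(in, StandardCharsets.UTF_8));
        return tabs;
    }
}

class TabStateStoreTest {

    void roundTripsThroughByteStreams() throws IOException {
        Properties tabs = new Properties();
        tabs.setProperty("tab.0", "C:/music");

        TabStateStore store = new TabStateStore();

        // Write to memory instead of an INI file on disk...
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        store.save(tabs, out);

        // ...and read it straight back from the same bytes.
        Properties reloaded = store.load(new ByteArrayInputStream(out.toByteArray()));
        assert "C:/music".equals(reloaded.getProperty("tab.0"));
    }
}

In production the same save/load methods are simply handed a FileOutputStream/FileInputStream pointing at the real INI file.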
I am writing unit tests for my Java program. My program does a lot of things that involve reading from a file that the user provides and creating a new file based on the contents of that input file.
Currently, my unit tests use premade test files that I've placed in the resources source directory. When the program creates a file, it places it in the same resources source directory.
I've looked at a lot of questions and answers on here, and there are so many varying opinions on how to handle files in unit tests. Is it proper to use these premade files within my unit tests, or is there a better solution?
Yes, the standard practice is to place your static input files for testing in the src/test/resources directory. If your main code generates output, the cleanest place to put it during testing is your platform-specific tmp directory. In Java, this is available under the system property java.io.tmpdir:
System.out.println(System.getProperty("java.io.tmpdir"));
If you use this directory, the tests will be portable between developers running under macOS, Linux, or Windows, as well as on any build server.
It is also a good idea for your test to delete these temporary files when the test is complete, and for your test not to make any assumptions about temporary files that may be left over from a previous test run.
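With JUnit 5, the temporary-directory bookkeeping can be handed to @TempDir, which gives each test a fresh directory and deletes it afterwards. A small sketch along those lines (the input file name, and the copy standing in for your real conversion code, are assumptions):

import static org.junit.jupiter.api.Assertions.assertTrue;

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.io.TempDir;

class FileConverterTest {

    @Test
    void convertsInputFile(@TempDir Path tempDir) throws Exception {
        // Static premade input lives under src/test/resources.
        Path input = Paths.get("src/test/resources/sample-input.txt");

        // Output goes to a per-test temporary directory that JUnit cleans up for you.
        Path output = tempDir.resolve("sample-output.txt");

        // Stand-in for your real conversion code (hypothetical): here we just copy the file.
        Files.copy(input, output);

        assertTrue(Files.exists(output));
        assertTrue(Files.size(output) > 0);
    }
}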
Whether or not static "premade" files are appropriate largely depends on the logic you're trying to test and how you want to manage your test data. Perhaps you need test data that contains a date that is relative to today's date. In this case using a static file could be more trouble than writing code to calculate a relative date.
As JB Nizet pointed out in his comment, it's generally not a good idea to store test data alongside your source code. I'm not sure if you're familiar with Maven and its standard directory layout, but you might take a look at it and see how it helps keep the test and source files separated.
So I have been tasked with integrating a program called "lightSIDE" into a Hadoop job, and I'm having some trouble figuring out how to go about this.
Essentially, rather than a single JAR, lightSIDE comes as an entire directory, including XML files that are crucial to its operation.
Up until now, the way the data scientists on my team have been using this program is by running a Python script that actually runs an executable, but this seems extremely inefficient as it spins up a new JVM every time it gets called. That being said, I have no idea how else to handle this.
If you are writing your own MapReduce jobs, then it is possible to include all the JAR files as libraries and the XML files as resources.
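A sketch of what that might look like in the job driver, with made-up HDFS paths: the JARs go onto the task classpath and the XML files travel to each task via the distributed cache.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;

public class LightSideJobDriver {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "lightside-processing");
        job.setJarByClass(LightSideJobDriver.class);

        // Put the LightSide JARs on the classpath of every task (hypothetical HDFS paths).
        job.addFileToClassPath(new Path("/libs/lightside/lightside.jar"));
        job.addFileToClassPath(new Path("/libs/lightside/weka.jar"));

        // Ship an XML configuration file to each task via the distributed cache,
        // symlinked under a local name the mapper can open directly.
        job.addCacheFile(new URI("/libs/lightside/config/feature-settings.xml#feature-settings.xml"));

        // ... set mapper/reducer, input/output paths, then submit:
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}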
I'm one of the maintainers for the LightSide Researcher's Workbench. LightSide also includes a tiny PredictionServer class to handle predictions on new instances over HTTP - you can see it here on BitBucket.
If you want to train new models instead, you could modify this server to do what you want, drawing clues from the side.recipe.Chef class.
I want to create a command line scaffolding utility for a Java-based test automation project, the way it happens with Rails or the Play Framework. I don't have much of an idea about how exactly Rails or Play does that.
What I want to achieve is basically to:
Generate a set of standard folders with some generic .java files and other framework utilities.
Add references to all the required JAR files in the project.
Create configuration files.
I think this could simply be done by writing a shell script, but before going for one I would like to know if there's a better way to create such a utility. Any suggestions?
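To give an idea of the scale I have in mind, the whole thing might not need to be much more than a small Java main that creates the folders and writes template files; all the folder names and template contents below are just illustrative:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class Scaffold {

    public static void main(String[] args) throws IOException {
        if (args.length != 1) {
            System.err.println("usage: scaffold <project-name>");
            System.exit(1);
        }
        Path root = Paths.get(args[0]);

        // Standard folder layout (illustrative names).
        for (String dir : new String[] {
                "src/main/java", "src/test/java", "src/test/resources", "lib", "config"}) {
            Files.createDirectories(root.resolve(dir));
        }

        // Drop in generic starter files.
        Files.write(root.resolve("config/framework.properties"),
                "browser=chrome\nbase.url=http://localhost\n".getBytes());
        Files.write(root.resolve("src/test/java/BaseTest.java"),
                "public abstract class BaseTest {\n    // shared setup/teardown goes here\n}\n".getBytes());

        System.out.println("Scaffolded project at " + root.toAbsolutePath());
    }
}

If nothing better exists, a shell script doing the equivalent mkdir/cp calls would be just as viable.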
I just got a requirement to create a small (I assume standalone) utility to hit some code in our web application to do some custom processing of files from the app and then dump the files onto a shared drive. My question is: what is the best way to do this? Do I just create a small app, jar it up, and run it from the command line, or is there a better way?
Sorry, I didn't give enough detail. It's an old application, over 10 years old; while it has been upgraded to JDK 1.6, most of the code uses the old collections, old loops, etc. There aren't any interfaces, and the code is very tightly coupled, using inheritance with lots of nested objects. The web app will do the processing. I think what they want is some code outside of the application that will log in and then fire off the file-processing code. Prior to this I had upgraded their version of Windward Reports in a separate branch, and they want to make sure that the processed files (contracts, forms, etc.) don't get altered greatly, as there are legal requirements on fonts and layouts. So this utility will go in, fire off the list of reports (a few thousand), and dump them to a share drive so they can be viewed en masse with another tool for comparison, based on rules you can automate with that commercial tool. I was thinking of creating a small class with a main method, jarring it up, and then, while the web server is running with my upgraded branch code, running the utility from the command line to fire it off.
There's not enough to go on here. How are the web app's functions exposed? If it's a REST interface, then wget/curl/Spring's RestTemplate are the way to go. If it's something like a JSF app, then you're going to need something like Selenium to imitate a browser. If the functionality is in a shared library (JAR), then the web never even comes into play.
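If it does turn out to be a plain HTTP endpoint, the trigger side of the utility can be as small as the following; the URL and the batch parameter are of course made up:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ReportTrigger {

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint in the web app that kicks off report processing.
        URL url = new URL("http://localhost:8080/app/reports/run");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Made-up parameter naming the report batch to run.
        byte[] body = "batch=all-contracts".getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }

        // Print the status and whatever the app reports back.
        System.out.println("HTTP " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}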
Well, I was originally looking at creating a standalone utility jar that I would run from the command line to connect to the app with URLConnection, but I found there is already testing code built into the application that I can run from the command line as long as I deploy the new code alongside the existing code. The utility will dump the files out to a shared drive, and then XTest can be run to compare the files. After reviewing the capabilities of XTest, it appears that it can handle the file comparison well.