I am working on a Java web app with unit tests that deploy the app in Jetty. I use HtmlUnit to hit the app and do some high-level tests. I set it up so that I can use a singleton probe to modify my system configuration and add a "test" flag. This is handy because I want to be able to run some tests without having to authenticate an actual user or check user roles.
However, it seems like it could open the door to a vulnerability when the app is deployed. I'm looking for suggestions on how to make this "back door" a little more bulletproof. I could use a mock object to handle this, but I think that still leaves the back door exposed.
I have user accounts specifically for testing in all of my environments. I create them using the real registration process, nothing hand-made.
This bypasses your issue, allows me to test the sign-in process, and, if needed, I create multiple users with different traits/roles that I can test against.
Because the users are under my control, they remain consistent and match the expected test results.
Use a special parameter that is long enough that it cannot be guessed, for example a GUID. It could even be hard-coded in your application. All tests append this parameter to each URL they use. You can check for this parameter in a special servlet filter and turn test mode on.
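For illustration, a minimal sketch of such a filter, assuming a javax.servlet stack; the TestModeFilter class, the testToken parameter name, and the hard-coded value are made up:
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

public class TestModeFilter implements Filter {

    // Hypothetical hard-coded token; long and random enough that it cannot be guessed.
    private static final String TEST_TOKEN = "c1f2b5a4-7e3d-4a9c-b8f0-6d2e1a5c9b47";

    @Override
    public void init(FilterConfig config) {
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        // Flag the request as test mode only when the secret parameter matches exactly.
        if (TEST_TOKEN.equals(request.getParameter("testToken"))) {
            request.setAttribute("testMode", Boolean.TRUE);
        }
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
    }
}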
Throw some kind of security around the process you use to change the app over to Test mode - Basic Authentication for that page, or something. This can all be configured directly in web.xml.
I want to add WebAuthn as an option for multi-factor authentication in an Angular & Spring application. I use Yubico's java-webauthn-server library.
What is the best way to integration-test my WebAuthn server without a hardware client? Is there any software that can handle the cryptography in an automated test? I want to run these tests automatically in the CI/CD pipeline (GitLab).
In a best case scenario I want to be able to test the whole process, creating credentials as well as logging in. An alternative scenario could be that I use known credentials in the backend and only log in with these.
My API is REST/JSON based, with relying party, user, challenge, pubKey etc...
My integration tests are Java-based (Spring Boot starter test).
I am mainly interested in how to integration test the server without the client side. Are there utility programs or libraries that can handle authenticators and return the correct data/json objects?
I have looked at the Testing WebAuthn via REST tool; however, I am not interested in testing the specification. Since I am using a library, I only want to ensure that I have applied the library correctly in my code.
If you are only interested in testing the server side, you can write a simple webpage with buttons that exercise your endpoints and call navigator.credentials.(create|get). You can then instrument a browser using Selenium 4+, set up Virtual Authenticators, and run tests against that webpage. Take a look at the selenium tests for an example. The code to set up the authenticators looks like this in java:
import org.openqa.selenium.virtualauthenticator.HasVirtualAuthenticator;
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticator;
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticatorOptions;
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticatorOptions.Transport;

// Platform (internal) authenticator with user verification enabled.
VirtualAuthenticatorOptions options = new VirtualAuthenticatorOptions()
    .setTransport(Transport.INTERNAL)
    .setHasUserVerification(true)
    .setIsUserVerified(true);
VirtualAuthenticator authenticator =
    ((HasVirtualAuthenticator) driver).addVirtualAuthenticator(options);
Pay attention to setting up the authenticator with the right settings to match your webauthn call. You should pick the right user verification support, resident keys support, and internal (i.e. platform) vs usb / nfc / ble (i.e. cross-platform) transport.
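For example, still using Selenium's virtual authenticator options, a cross-platform authenticator with resident key support could be configured roughly like this (the roamingOptions name is just for illustration):
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticatorOptions;
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticatorOptions.Protocol;
import org.openqa.selenium.virtualauthenticator.VirtualAuthenticatorOptions.Transport;

// Roaming (cross-platform) authenticator over USB with discoverable (resident) credentials,
// for registration calls that request resident keys.
VirtualAuthenticatorOptions roamingOptions = new VirtualAuthenticatorOptions()
    .setProtocol(Protocol.CTAP2)
    .setTransport(Transport.USB)
    .setHasResidentKey(true)
    .setHasUserVerification(true)
    .setIsUserVerified(true);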
If you're using an older version of Selenium, you'll have to define the commands manually yourself. The code should look like this:
browser.driver.getExecutor().defineCommand(
    "AddVirtualAuthenticator", "POST", "/session/:sessionId/webauthn/authenticator");
// ...
Command addVirtualAuthCommand = new Command("AddVirtualAuthenticator");
addVirtualAuthCommand.setParameter("protocol", "ctap2");
addVirtualAuthCommand.setParameter("transport", "usb");
browser.driver.getExecutor().execute(addVirtualAuthCommand);
Running selenium tests might take a bit of work if you aren't already using it for integration testing. However, this implementation will very closely match reality. From the browser's perspective, the virtual authenticator is real hardware. The response from the authenticator will be processed by the browser as if it was real.
At the moment, only Chromium-based browsers support Virtual Authenticators.
I have a Java application using an Oracle DB, running on Apache Tomcat. On a normal day, the app runs fine. However, traffic doubled one day, and the app started to show increased response times and timeouts.
After that, we tried running a load test with JMeter using the same amount of load, but never encountered any response-time or timeout issues in testing. By the way, we checked our network monitoring tools; there are no issues with the infrastructure.
What should I be looking for if I want to replicate the same issue during testing? Replicating it would help ensure that the changes we are going to make will actually work.
Thanks!
The underlying question is: how to make the load test realistic enough to replicate the database-access slowness observed in production. The thought process in our comments exchange was to review and rule in or rule out factors that are often not properly emulated in load tests and therefore give over-optimistic performance results. I reviewed four factors:
Does the load test script correctly correlate dynamic values? Yes it does, because the records created by the load test match the scenario. If it did not, failed transactions in the load test would be responsible for the unrealistically fast responses. The recommendation would be to correlate your script manually.
Does the load test script correctly emulate multiple authenticated users? The application does not require login. If it did, running the load test with a single user would fail to exercise the system overhead of maintaining multiple user sessions. The recommendation would be to parameterize the recorded credentials using a dataset with multiple credentials.
Does the load test script correctly emulate anonymous users authenticated through cookies? The application does not use cookie authentication. If it did, the recommendation would be to clear the browser cache before recording to make sure a stale cookie is not recorded, and then make sure cookie correlation in the script is configured properly.
Does the load test script correctly emulate the data in the recorded scenario? Assume the test scenario recorded some user entry that is used as a criterion for a database query. If you replay multiple iterations emulating the same entry, the database may not be hit for such queries due to application or database caching. If this is the case, the recommendation is to parameterize user entries using test datasets.
If the last factor is also not the case, then there are more factors to go through. For more about correlating and parameterizing load tests, check this blog: http://www.stresstimulus.com/blog/post/eradicating-load-testing-errors-1
I have a service that calls out to a third-party endpoint using java.net.URLConnection. As part of an integration test that uses this service I would like to use a fake endpoint of my own construction.
I have made a Spring MVC Controller that simulates the behaviour of the endpoint I require. (I know this endpoint works as expected because I included it in my web app's servlet config and hit it from a browser once started.)
I am having trouble figuring out how I can get this fake endpoint available for my integration test.
Is there some feature of Spring-Test that would help me here?
Do I somehow need to start up a servlet at the beginning of my test?
Are there any other solutions entirely?
It's a bad idea to use a Spring MVC controller as a fake endpoint. There is no way to simply have the controller available for the integration test, and starting a servlet with just that controller alongside whatever you are testing requires a lot of configuration.
It is much better to use a mocking framework like MockServer (http://www.mock-server.com/) to create your fake endpoint. MockServer should be powerful enough to cover even complex responses from the fake endpoint, with relatively little setup.
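A rough sketch of what that could look like with MockServer's Java client; the class name, port 1080, path, and JSON body are placeholders:
import static org.mockserver.integration.ClientAndServer.startClientAndServer;
import static org.mockserver.model.HttpRequest.request;
import static org.mockserver.model.HttpResponse.response;

import org.mockserver.integration.ClientAndServer;

public class FakeThirdPartyEndpoint {

    private ClientAndServer mockServer;

    public void start() {
        // Start MockServer on a local port and stub the third-party endpoint.
        mockServer = startClientAndServer(1080);
        mockServer
            .when(request()
                .withMethod("GET")
                .withPath("/third-party/resource"))
            .respond(response()
                .withStatusCode(200)
                .withHeader("Content-Type", "application/json")
                .withBody("{\"status\":\"ok\"}"));
    }

    public void stop() {
        // Shut the fake endpoint down after the test.
        mockServer.stop();
    }
}
In the integration test you would then point the service's URLConnection at http://localhost:1080/third-party/resource (or whatever port and path you configure).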
Check out Spring MVC Test, which was added to Spring in version 3.2.
Here are some tutorials: 1, 2, 3
First I think we should get the terminology right. There are two general groups of "fake" objects in testing (simplified): mocks, which return predefined answers for predefined inputs, and stubs, which are simplified versions of the objects the SUT (system under test) communicates with. While a mock basically does nothing more than provide a response, a stub might use a live algorithm, but not store its results in a database or send them to customers via email, for example. I am no expert in testing, but those two kinds of fake objects are rather meant for unit tests and, depending on their scope, acceptance tests.
So your SUT communicates with a remote system during the integration test. In my book this is the perfect time to actually test how your software integrates with other systems, so your software should be tested against a test version of the remote system. In case this is not possible (they might not have a test system), you are conceptually in some sort of trouble. You can shape your stub or mock only in the way you expect it to work, very much like the part of the software you have written to communicate with that remote service. This leaves out some important things you want to test with integration tests: Was the client side implemented correctly so that it will work with the live server? Do we have to develop workarounds because there are implementation errors on the server side? How will the communication with the remote system affect our software's performance? Do our authentication credentials work? Does the authentication mechanism work? What are the technical and conceptual implications of this communication relationship that no one has thought of so far? (Believe me, the latter will happen more often than you might expect!)
Generally speaking: what will happen if you do integration tests against a mock or a stub is that you test against your own understanding of how to implement the client and the server side of the communication; you do not test how your client works with the actual remote server, or at least the next best thing, a test system. I can tell you from experience: never make assumptions about how a remote system should behave; test it. Even when talking about a JMS server: test it!
In case you are working for a company, testing against a provided test system is even more important: if your software works against a test system and you can prove it (Selenium is a good helper here, as well as good logging, believe it or not) and your software does not work with the live version, you have a situation I call "instablame": it is immediately obvious that it is not your fault the software isn't working. I myself hate finger-pointing to the bone, but most suits tend to ask "Whose fault was it?" even before "Can we fix that immediately?" and way before "How can we solve that problem?". And there is a special group of suits called lawyers, you know ... ;)
That being said: if you absolutely have to use those stubs during your integration tests, I would create a separate project for them (let's say "MyProject-IT-Stubs") and build and run the latest version of MyProject-IT-Stubs before I run the IT of my main project. When using Maven, you could give MyProject-IT-Stubs war packaging, declare it as a dependency, and fire up a Jetty for this war in the pre-integration-test phase. Then your integration tests run, successfully or not, and you can tear down the Jetty in the post-integration-test phase.
The IMHO best way to organize your project with Maven would be to have a project with three modules: MyProject, MyProject-IT-Stubs and MyProject-IT (the latter declaring dependencies on MyProject and MyProject-IT-Stubs). This keeps your projects nice and tidy, and the stubs do not pollute your main project. You might want to think about organizing MyProject-IT-Stubs into modules as well, one for each remote system you have to talk to. As soon as you have test access, you can simply deactivate the corresponding module in MyProject-IT-Stubs.
I am sure corresponding options exist for InsertYourBuildToolHere.
We are building a Spring-MVC web application for 80 000 users.
I see a lot of controllers in the petclinic example using:
the @SessionAttributes annotation and
SessionStatus status ... status.setComplete() to store and remove the beans from the HTTP session. Very useful indeed.
Is it the best way to go if you plan to build an application for 80 000 users ?
Could you still use session load balancing and session failover if you plan to store all your form data like this ?
It will probably not meet your needs, no. There are two principal problems with the built-in implementation:
It doesn't really support tabbed browsing. If a user loads the same screen in multiple browser tabs, the two tabs accessing one controller are going to clobber each other's session attribute data.
If users don't follow your "planned" navigation path, that setComplete() call will get missed and the object will hang around until the session expires and is cleaned up.
Number 1 may or may not be a concern depending on how your app is designed and what it does. (Some things, e.g. banks, deliberately thwart multi-tabbed usage anyway.) But most users, I think, would expect to be able to edit User A's profile in one tab and User B's profile in another tab and not have submitting one form break the other screen.
Number 2 you could work around by always submitting a screen into its own controller then redirecting after cleanup, but that's a lot of work if you aren't already building that way.
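For reference, a hypothetical controller sketch of that lifecycle: the form object stays in the session until setComplete() is called, followed by a redirect (names like ProfileController and profileForm are made up):
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.SessionAttributes;
import org.springframework.web.bind.support.SessionStatus;

@Controller
@RequestMapping("/profile")
@SessionAttributes("profileForm")
public class ProfileController {

    @RequestMapping(method = RequestMethod.GET)
    public String showForm(Model model) {
        // Puts the form bean into the session under "profileForm".
        model.addAttribute("profileForm", new ProfileForm());
        return "profileForm";
    }

    @RequestMapping(method = RequestMethod.POST)
    public String submit(@ModelAttribute("profileForm") ProfileForm form, SessionStatus status) {
        // ... persist the form ...
        status.setComplete(); // removes "profileForm" from the session
        return "redirect:/profile/done";
    }

    public static class ProfileForm {
        // form fields omitted
    }
}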
The good news is that org.springframework.web.bind.support.SessionAttributeStore is a recognized extension point! You can provide any implementation of it you like and inject it into your dispatcher servlet. You don't even need to use the web session to store the information if you want to avoid bloating it up with business objects. You could put the actual storage in a backend Terracotta cluster, for example, and not worry about it being compatible with your clustering strategy.
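As a rough sketch (not the actual implementation), a custom SessionAttributeStore could look like this; the in-memory map is just a stand-in for whatever backend store you would really use:
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import org.springframework.web.bind.support.SessionAttributeStore;
import org.springframework.web.context.request.WebRequest;

public class ExternalSessionAttributeStore implements SessionAttributeStore {

    // Stand-in for an external store (Terracotta, cache, database, ...).
    private final ConcurrentMap<String, Object> store = new ConcurrentHashMap<String, Object>();

    @Override
    public void storeAttribute(WebRequest request, String attributeName, Object attributeValue) {
        store.put(key(request, attributeName), attributeValue);
    }

    @Override
    public Object retrieveAttribute(WebRequest request, String attributeName) {
        return store.get(key(request, attributeName));
    }

    @Override
    public void cleanupAttribute(WebRequest request, String attributeName) {
        store.remove(key(request, attributeName));
    }

    private String key(WebRequest request, String attributeName) {
        // Scope entries to the current session so users do not clobber each other.
        return request.getSessionId() + ":" + attributeName;
    }
}
You would then wire this implementation into your Spring MVC configuration in place of the default store.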
--
And then there's always option Gamma if you really need true scalability: rework it into a RESTful strategy that doesn't rely on server-side state in the first place :)
I'm using the Java Preferences API to store and retrieve small pieces of information in a swing/java application.
Now I have set up Java Web Start to launch the application from my web page, and I get a security exception. In order to get rid of this exception, I'd have to prompt the user for permissions, and I refuse to do that because my application does nothing else that would require the user's permission.
That's why I need an alternative solution for storing a few key values from one execution to another: some sort of cookie or whatever. Do you know of any?
You may want to look into using PersistenceService, a feature of Java Web Start that "provides methods for storing data locally on the client system, even for applications that are running in the restricted execution environment." Related examples may be found here and here.
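A minimal sketch of saving a small value with PersistenceService; the "settings" entry name and the 1 KB size request are made up for illustration:
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.URL;

import javax.jnlp.BasicService;
import javax.jnlp.FileContents;
import javax.jnlp.PersistenceService;
import javax.jnlp.ServiceManager;

public class WebStartStorage {

    // Saves a small value under the application's codebase without extra permissions.
    public void saveValue(String value) throws Exception {
        PersistenceService persistence =
            (PersistenceService) ServiceManager.lookup("javax.jnlp.PersistenceService");
        BasicService basic =
            (BasicService) ServiceManager.lookup("javax.jnlp.BasicService");

        // Entries are addressed by URLs relative to the codebase.
        URL entry = new URL(basic.getCodeBase(), "settings");
        try {
            persistence.create(entry, 1024); // request up to 1 KB of local storage
        } catch (IOException alreadyExists) {
            // The entry already exists from a previous run; just reuse it.
        }

        FileContents contents = persistence.get(entry);
        try (Writer writer = new OutputStreamWriter(contents.getOutputStream(true))) {
            writer.write(value);
        }
    }
}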