Scan dependency-jars for 'phoning home'? - java

Recently we found a jar in our application phoning home; see the details at https://github.com/nextgenhealthcare/connect/issues/3431. This is highly undesirable behaviour.
What would be the best approach to detect this in one of our dependencies?
P.S. I could unzip all the jars and scan for HttpClient or UrlConnection, but there are many ways to connect to external systems, and I'd rather not reinvent the wheel.
I am aware of OWASP Dependency-Check, but phoning home is not a CVE per se.
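For what it's worth, the crude scan I had in mind would look something like this; it only catches direct references (class names sit as plain strings in the constant pool) and misses reflection, obfuscation and native code:

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Crude scan: flag .class entries that reference java.net/java.nio networking
// classes. Class names are stored as plain strings in the constant pool, so a
// simple byte search catches most direct references.
public class JarNetScanner {
    private static final String[] MARKERS = {
            "java/net/Socket", "java/net/URL", "java/net/http/HttpClient",
            "java/nio/channels/SocketChannel"
    };

    public static void main(String[] args) throws IOException {
        try (JarFile jar = new JarFile(args[0])) {
            jar.stream().filter(e -> e.getName().endsWith(".class")).forEach(e -> scan(jar, e));
        }
    }

    private static void scan(JarFile jar, JarEntry entry) {
        try (InputStream in = jar.getInputStream(entry)) {
            // ISO-8859-1 maps bytes 1:1 to chars, so contains() works on raw bytes.
            String content = new String(in.readAllBytes(), StandardCharsets.ISO_8859_1);
            for (String marker : MARKERS) {
                if (content.contains(marker)) {
                    System.out.println(entry.getName() + " -> " + marker);
                }
            }
        } catch (IOException ex) {
            System.err.println("Could not read " + entry.getName() + ": " + ex);
        }
    }
}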

If you scan your jars and they do have network connectivity, what can you do then? You can't recompile the source, because you don't have it. It's a case of finding something you can do nothing about (apart from finding an alternative).
The only way is to firewall your application or network, use containers, and keep fine-grained control over what your application talks to. Basically, run your jars with zero trust!
I guess it really boils down to trusting your jar files, and that in turn means trusting the humans behind everything that goes into a jar file (design, coding, build, distribution, maintenance): the whole SDLC.
If you approach the problem with zero trust, you can get the JVM (security manager), the operating system (SELinux, system capabilities, Docker) or the network (firewall, proxy, IDS), or all three, to control and audit access attempts, and either deny or permit each access according to a policy that you set.
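As an illustration of the JVM option, here is a minimal sketch; note that SecurityManager is deprecated for removal since JDK 17, so on modern runtimes the OS and network controls are the realistic choices, and the allow-listed host below is purely hypothetical:

import java.net.SocketPermission;
import java.security.Permission;

// Sketch: veto outbound socket connections to hosts that are not explicitly
// allowed, and let everything else through. A real policy should be stricter.
public class ZeroTrustManager extends SecurityManager {
    @Override
    public void checkPermission(Permission perm) {
        if (perm instanceof SocketPermission && perm.getActions().contains("connect")
                && !perm.getName().startsWith("internal.example.com")) {
            throw new SecurityException("Blocked outbound connection: " + perm.getName());
        }
        // All other permissions are granted in this sketch.
    }

    @Override
    public void checkPermission(Permission perm, Object context) {
        checkPermission(perm);
    }

    public static void main(String[] args) {
        System.setSecurityManager(new ZeroTrustManager());
        // ... launch the application; any jar that phones home to an
        // unapproved host now fails with a SecurityException.
    }
}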
Scanning the jars for network calls can be done, but I'm sure that if a jar really wants to obfuscate its network behaviour, it will be able to, especially if it can run shell commands or dynamically load jars itself.
And a jar you scan today might not have the same behaviour after the next update: the classic supply chain attack.
If you don't trust your jars, you must establish that trust, either through scanning or by auditing the source code.
There are many tools for this. I'm not sure if I'm allowed to recommend a particular product here that I've had success with, so I won't.

Related

Find all provided jars in Websphere

I'm using WebSphere Application Server 8.5.5.6 and 8.5.5.8 and from time to time run into problems when some jar or other in my application conflicts with one that is already present on the WAS. It's easy to fix, of course: simply mark the dependency as "provided" in Maven and there you go. But since IBM seemingly chose to write the AS with the most obscure error messages possible, it takes ages to find something like that out.
My question, which Google hasn't been able to answer so far:
Is there a complete list somewhere of which libraries, in which versions, are provided with WebSphere?
Assuming you're referring primarily to Open Source packages, the official list is here: https://www.ibm.com/support/knowledgecenter/en/SSAW57_8.5.5/com.ibm.websphere.nd.multiplatform.doc/ae/opensourcesoftwareapis.html
Beyond that, most of the stuff visible to apps should be Java EE/SE APIs, which I assume you were already expecting, and IBM-specific implementations (things in com.ibm.* packages), which are hopefully at low risk of collision.
At least if you are on Windows: take Process Monitor (not Process Explorer) and fire it up with a filter on "Path contains .jar". Then start WebSphere. At some point it will start loading jars from various directories. Process Monitor will show you which jars those are and where they are being loaded from.
This should provide you with first-hand information without reading IBM documents.
Besides, you are probably aware of this, but just in case: you should be careful when marking a dependency as "provided", since the version of the library used by your application might differ from the version used by WebSphere.
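For reference, "provided" is just a scope on the dependency in the pom; the artifact below is only a hypothetical example of a library that commonly collides with one the server already ships:

<dependency>
  <groupId>commons-logging</groupId>
  <artifactId>commons-logging</artifactId>
  <version>1.2</version>
  <!-- provided: compile against it, but use the server's copy at runtime -->
  <scope>provided</scope>
</dependency>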

Securing Java application with trusted and untrusted code

Suppose I want to ship a Java application that supports plugins. For security reasons, I want the first party code (i.e. my code) to run basically unrestricted, but I want to limit the operations that the plugins would be able to perform. What's the right way of doing this? Is using the SecurityManager appropriate? Secondarily, is there a way to enforce that the application is started with a particular policy file in place, or should I assume if the end user wants to hack another policy file into place, that's their prerogative (or at least, there's nothing I can do to prevent it?).
The reason I have my doubts about the SecurityManager is that the goal of the class seems to be to prevent the entire application from doing things the end user doesn't want, whereas I'd like to use one to manage subsets of the application, completely opaquely to the end user, if possible.
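Concretely, the kind of thing I imagine is a custom Policy keyed on the CodeSource, roughly like this sketch (the plugin path and the single permission granted are placeholders, not a worked-out design; SecurityManager and Policy are deprecated in recent JDKs):

import java.security.CodeSource;
import java.security.Permission;
import java.security.Policy;
import java.security.ProtectionDomain;
import java.util.PropertyPermission;

// Sketch: first-party code runs unrestricted; anything loaded from the
// plugins directory only gets a minimal set of permissions.
public class PluginSandboxPolicy extends Policy {
    @Override
    public boolean implies(ProtectionDomain domain, Permission permission) {
        CodeSource cs = domain.getCodeSource();
        if (cs == null || cs.getLocation() == null) {
            return true; // JDK/bootstrap classes
        }
        if (cs.getLocation().toString().contains("/plugins/")) {
            // Plugins may only read system properties in this sketch.
            return permission instanceof PropertyPermission
                    && "read".equals(permission.getActions());
        }
        return true; // first-party code is unrestricted
    }
}

// Install before any plugin class is loaded:
// Policy.setPolicy(new PluginSandboxPolicy());
// System.setSecurityManager(new SecurityManager());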

Can you remove sakai core tools you don't want/need?

Something I've been wondering recently: is it possible to actually "remove" core tools from the Sakai vanilla build without a huge effort (editing loads of config files)?
I know about stealthing tools (https://confluence.sakaiproject.org/display/DOC/Provisional+Tools) and I think there's some way to "disable" tools (or is that just stealthing?), but it would be nice if there were a supported means of not having tool X, Y or Z in the service at all, simply to remove the possibility of potential problems and to lower the service's memory footprint and startup time.
I've never tried just removing jars to see what happens, but I suspect that mightn't be a good idea and that it probably needs to be compiled up with fewer tools deployed to the webapp directory, which I would think means changing a whole load of Maven files so that "mvn clean install sakai:deploy" produces something lighter.
The Sakai architecture is actually more akin to a lot of loosely (or tightly in some cases) coupled tools than a unified system. This is an advantage from the perspective that you can do exactly the thing you want to do here. It is a disadvantage from a unified user experience perspective (though that is not an architectural limitation but rather a side effect of how the tool teams were run early on in the project).
If you want to remove a tool (like Samigo for this example) then you can simply delete the war file (and directory) related to it from your TOMCAT_HOME/webapps directory. Run this from your tomcat home directory:
rm -rf webapps/samigo-app*
When you start up tomcat, the tool will not be loaded and things will work fine (assuming there is not another tool or part of Sakai that expects it to be there). Some tools like Resources (sakai-content-tool) should not be removed for that reason (though stealthing them would be fine).
Please note that removing only the tool will not save you as much as you might hope, since there is also a service related to most tools which lives in TOMCAT_HOME/components. The service component is actually an exploded war file (basically the same as the tool webapp), but it has no interface and has to follow some Sakai conventions in order to load properly. In the case of Samigo again, you could remove it like so (running from your tomcat home):
rm -rf components/samigo-pack
You should NOT do this while the system is running. You should also NOT remove the API jars from shared.
When you restart Sakai after removing the component, you will see a more significant drop in resource usage, since the tool service is no longer loaded into memory and initialized. I saw about a 5-second reduction in startup time (90s to 85s) and about a 25MB reduction in JVM memory used (from 795MB to 770MB) by removing Samigo and its service.
Your best bet is to trial-and-error your way to the optimal solution for your situation: try removing a tool and its service (if it has one) and see whether things start up without errors and the tools you do use work as expected.
Also, please note that removing a tool will NOT remove the tool pages in existing courses. You will end up with a page which simply displays nothing (because Sakai sees it as an empty page in the course now). If you add the tool back into the system then it will appear on the page again.
UPDATE: If you want to remove the blank tool page, the easy option is to go into the site and remove the page the tool is on. This can be done from the Sites admin tool.
Alternatively, you could go into the database and remove all the pages which contain the specific tool id. This is pretty risky though so I don't recommend it.
Generally, the removal of a tool like this would happen before the tool is used in production so hopefully this is a rare case.
After fairly extensive testing, this is what I've found you can remove related to VLE functionality. This probably won't apply to that many people, but it's useful if you purely want collaborative tools (for running a research VRE, or just a slimline tool provider):
Under tomcat webapps...
samigo (also make sure you remove the samigo folder under <tomcat root>/sakai/samigo)
presence (make sure you turn off presence in sakai.properties too though!)
sakai-podcasts
podcasts
lessonbuilder
osp (all of it, i.e. delete all wars referencing osp-*)
sakai-signup-tool (we don't have a need for this, but you might)
citations
polls-tool
sakai-gradebook-tool (DO NOT remove sakai-gradebook-testservice!)
grades-rest
dav (assuming you don't use WebDAV; make sure to turn off WebDAV in sakai.properties too. We use Shibboleth for SSO, so we can't currently use WebDAV... also, the advent of multi-file drag and drop + zip archiving of files/folders in Resources has made WebDAV doubly needless)
sakai-syllabus-tool
sakai-reset-pass (Again, we use shibboleth SSO, so don't need password reset functionality)
DO NOT remove sakai-assignment-tool
sakai-postem
Under tomcat components...
samigo
presence
sakai-podcasts
lessonbuilder-components
osp (all of it)
sakai-signup
citations
polls-tool
(I haven't fully tested this, but it seems wise to NOT remove grade related directories)
sakai-syllabus-tool
DO NOT remove sakai-assignment-tool
Removing this stuff reduced my startup time by a couple of minutes and reduced the memory footprint on the server too (I don't have exact figures for that).

How to sign multiple JNLP application using Maven

We use JNLP applications for my business. Current practice requires manually signing the jars for each release. This inevitably leads to having different certificates, expired certificates and so on...
We did a proof of concept with Maven to automatically sign an application using the Maven Jarsigner Plugin.
Now, what is the best approach to industrialize such a process? I'd like to have the certificate shared among all applications instead of recreating one every time.
In particular:
Is it correct to have one certificate for a bunch of corporate applications, or should I consider having one per application?
Could we store the certificate(s) as dependencies (in our business repo) and have both dev and release certificates fetched upon build? Say, the dev cert for local builds and the release certificate for releases.
What are the flaws of such use ?
Is there any other/better solution ?
Thanks for your answers.
There are many ways to solve the problem, so I can only share my thoughts on the subject.
a) I would assume different releases live on different branches, so in essence we only deal with one release version at a time.
b) I then assume that, per version, you have different certificates per environment. The per-environment part can be handled using Maven profiles (http://maven.apache.org/guides/introduction/introduction-to-profiles.html), so...
Whether to have multiple certificates or a single one is a matter of preference. Since a certificate provides the level of trust between any given user and the given app, it is essentially a judgment of risk versus maintainability.
Risk, in that multiple apps sharing the same certificate means higher exposure, including to malign exposure, and a breach of one is a breach of all; so it may matter what the certificates guard. Maintainability, in that all apps follow the same update cycle, and a change to one means a change to all.
So with a single certificate the coupling is a bit higher, the risk is higher and the maintenance is simpler. If you were global enterprise Acme Inc. the risk would probably be higher than if you were local enterprise Icme Inc., and if it were other people's data or money at stake, that would probably invite the safest option available.
I see no reason why certificates cannot be stored, either in the repository, in some other safe repo, or simply lying around. What is more interesting is the private keys and passwords, which you can specify as properties, with the dev ones bound to the dev profile and the release ones omitted, so that you have to provide them on the command line.
Assuming you use the Maven Jarsigner Plugin, you could have ${my.keypass} and ${my.keystore}, and then a dev profile with both properties set, and a release profile with only the keystore set.
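A sketch of what that could look like in the pom (profile ids, property names and paths here are just illustrations):

<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <my.keystore>${basedir}/certs/dev-keystore.jks</my.keystore>
      <my.keypass>dev-password</my.keypass>
    </properties>
  </profile>
  <profile>
    <id>release</id>
    <properties>
      <my.keystore>/secure/release-keystore.jks</my.keystore>
      <!-- my.keypass intentionally unset: supply -Dmy.keypass=... on the command line -->
    </properties>
  </profile>
</profiles>

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jarsigner-plugin</artifactId>
  <configuration>
    <keystore>${my.keystore}</keystore>
    <keypass>${my.keypass}</keypass>
  </configuration>
</plugin>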
Last time I used certificates in a similar manner I had:
- a set of individual components
- in a single repository
- which could be built as a single complete entity.
So sharing the certificates was an easy takeaway. All certificates except the final prod one were in the source code repository; the certificate for releases was kept on a secure server, where we had a batch process which only a few people had access to.
As for security compromises... I don't think we ever encountered one, but we were prepared :)

Incremental deployment of java web applications

We have the following problem: developers frequently need to make small changes to our web applications. When I say small, I mean things like correcting the spelling on a web page or similar. Generating and redeploying war archives can be slow and costly in such scenarios.
How could we automate and install changes incrementally? For example: generate a new exploded war, compare the files with the exploded war in production, and then replace in production only the files affected by the change (.jsp, .html, .class etc.).
This need not be hot deployment; it's OK to restart the server. What I wish to avoid is having to copy and deploy wars that can be 80MB in size. Sometimes connections are slow, and making a change as minuscule as a simple spelling correction can take hours.
We use Maven to automate our build process. The key issue is to automate the whole process, so that I can be sure that app v2.2.3 in my Subversion is exactly what I have in production after an incremental deployment.
We used to do this sort of thing all the time. We worked in a bank, and there were sometimes changes to legal phrases or terms and conditions that needed to be changed today (or, more usually, yesterday).
We did two things to help us deploy quickly. First, we had a good change control and build process: we could change and deploy any version we liked. We also had a good test suite, with which we could test changes easily.
The second was more controversial: all of our HTML was deployed as separate files on the server; there was no WAR. Therefore, when circumstances demanded that we change something textual quickly, we could do it. If Java code needed changing, we always did a FULL build and deploy.
This is not something I'd recommend, but it was good for our situation.
The point of a WAR is that everything gets deployed at the same time. If you're using a WAR, that means you want it deployed all at once.
One suggestion is not to make such corrections so often (once a week?). Then you won't have so much pain.
Hard to say. You can of course replace single class files in an exploded webapp, but this is generally a bad idea and you don't see many people doing it.
The reason is that when you make small changes, it becomes harder and harder to detect differences between production and development. The chance of sending the wrong class file and breaking the production server increases over time.
When you say text changes, wouldn't it be an idea to keep the text resources separate from the war file? That way, not only developers but maybe even the customer could easily add or change translations.
To the customer it's important, but technically it's silly to do a 80MB deploy over a slow line to fix a small typo.
You can also try to look at your build/delivery cycle and increase testing efforts to prevent these small changes.
Hope this helps.
You can have the master war deployed somewhere the running servers can access it, and instead of deploying war files to the individual servers, use rsync and perl to determine whether any files in the master war have changed, distribute the changed files to the servers, and execute restarts.
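A sketch of that idea (host names and paths are hypothetical, and plain rsync over ssh stands in for the perl part):

# Sync only the changed files from the exploded master war, then bounce tomcat.
rsync -avz --delete /deploy/master-war/ appserver1:/opt/tomcat/webapps/myapp/
ssh appserver1 '/opt/tomcat/bin/shutdown.sh && /opt/tomcat/bin/startup.sh'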
diff and patch:
http://stephenjungels.com/jungels.net/articles/diff-patch-ten-minutes.html
At the moment I have SVN installed on the remote server, so in the case of a simple update you can update just a single file. Transferring the big WAR file would be quite impractical.
You can automate this into a single-click deployment using PuTTY/plink (if you are using Windows) by creating a simple script on the local machine and another on the remote machine.
At the moment I have a DEVELOPMENT SVN and a LIVE SVN. The Ant build merges DEV into LIVE and commits back to the LIVE repository. At that stage the remote server can do an SVN UP and automatically receive the requested files.
You can further improve the update script to restart the server when classes have changed and skip the restart when only scripts/JSPs were updated, as in the sketch below.
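A sketch of such an update script (all paths are hypothetical):

# Update the exploded webapp from SVN; restart only if any .class files changed.
cd /opt/tomcat/webapps/myapp
if svn up | grep -q '\.class$'; then
    /opt/tomcat/bin/shutdown.sh
    /opt/tomcat/bin/startup.sh
fi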
In this way you will also have the option to roll back to a previous version, so you can be sure that you have a working web app at all times.
To improve the SVN merging process, this tool is quite useful: http://www.orcaware.com/svn/wiki/Svnmerge.py
The usual answer is to use a Continuous Integration system which watches your Subversion repository, builds the artifacts and deploys them; you just need your web application to be able to keep working after being redeployed. The question is whether that is fast enough for you.
I don't think there's a straightforward answer to this one.
The key here is modularisation, a problem which I don't think is solved very well for Java applications at present. You may want to look at OSGi or dynamic modules, although I'm not sure how effective they are in terms of this problem.
I've seen solutions where people drop classes straight into the application server/servlet container. I don't agree with it, but it does appear to work... I'm sure there are horror stories, though!
Maven certainly makes things easier by splitting applications into modules, but if you do this and deploy modules independently, you need to make sure that the various versions play nicely together in a test environment to begin with...
An alternative is to partition your application in terms of functionality and host separate functions on various servers, e.g:
Customer Accounts - Server A
Search - Server B
Online Booking - Server C
Payment Services - Server D
The partitioning makes it easier to deploy applications, but again you have to make sure that your modules play nicely together first. Hope that helps.
I have been in a similar situation before. It really is a separation-of-concerns issue, and it's not too straightforward. What you need to do is separate the text from the template/HTML page.
We solved this by placing our text in a database table and using the table as a message resource, the same way people use myMessages.properties for internationalization (i18n). This gives you two advantages: you can internationalize the text, and you can make changes in production instantly and easily without a code deployment. We also cached the table to ensure performance didn't suffer much at all.
Not a solution for all, but it did work really well for us.
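A minimal sketch of that pattern, assuming a hypothetical app_messages(msg_key, locale, msg_text) table and a simple in-memory cache:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.sql.DataSource;

// Sketch: message texts live in a database table and are cached in memory,
// so a production text change is a simple UPDATE plus a cache refresh,
// with no code deployment.
public class DbMessageSource {
    private final DataSource dataSource;
    private final Map<String, String> cache = new ConcurrentHashMap<>();

    public DbMessageSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public String getMessage(String key, String locale) {
        return cache.computeIfAbsent(locale + ":" + key, k -> load(key, locale));
    }

    private String load(String key, String locale) {
        String sql = "SELECT msg_text FROM app_messages WHERE msg_key = ? AND locale = ?";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, key);
            ps.setString(2, locale);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : "???" + key + "???";
            }
        } catch (SQLException e) {
            throw new IllegalStateException("Could not load message " + key, e);
        }
    }

    // Call after editing the table in production to pick up new texts.
    public void refresh() {
        cache.clear();
    }
}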
