Large applets, large userbase, low bandwidth - Java

I've got an applet that is rather large (4MB) on a web page used by a few thousand users scattered across a WAN. Bandwidths to these users range from a paltry 128Kbps to 10Mbps.
The problem occurs when a new version of the applet is made available: it is downloaded automatically by all the users' browsers, effectively choking the network.
They really hate 'release day' morning around here :)
Is there any strategy to work around this problem?
Edit: I can only serve this applet centrally from one pair of servers. I cannot make any modifications to the hosting or network infrastructure.

Here are some ideas:
Divide your user community into N equal groups and provide a different applet URL for each group. Then stagger the times at which each group's copy of the applet is updated.
Put the applet on a server that has been tweaked to lie about the applet's modification date, and use this to (crudely) throttle the rate at which browsers fetch the updated applet.
Push the applet to locations on the local networks of large groups of users. Have the central server issue HTTP redirects so that each browser picks up the applet from a "close" location (see the sketch after this list).
Deploy caching HTTP proxies and autoproxy files on the local networks, and block direct access to the applet, forcing users to fetch it via the proxies.
The last option is probably the best.
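For the redirect idea, here is a minimal sketch of a servlet that 302-redirects each client to a nearby mirror; the subnet check and the mirror URLs are invented purely for illustration:

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical central redirector: pick a mirror based on the client's
    // address and send a 302 there. Subnets and mirror URLs are invented.
    public class AppletRedirectServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String ip = req.getRemoteAddr();
            String mirror = ip.startsWith("10.1.")
                    ? "http://mirror-siteA.example.com/applet.jar"
                    : "http://mirror-siteB.example.com/applet.jar";
            resp.sendRedirect(mirror); // browser fetches from the "close" copy
        }
    }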

Apart from what Stephen C mentions, I'd like to add another strategy that you should consider.
Consider breaking your applet codebase into different modules (archives).
These modules are then updated individually instead of the entire 4MB applet. You can have a special classloader that checks whether a new version of a given module is available.
For patches, have a separate "patch" archive that loads before any other archives, so that updated classes are loaded from the patch archive instead of from the older, already-downloaded archives.
Java Web Start already does some of these things to avoid full updates. You can take a look at the link (developer documentation) for a few pointers.
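As a rough illustration of the modular approach, here is a sketch of an updater that fetches only the archives whose version changed and puts patch archives first on the classpath. The modules.properties manifest, the jar naming scheme, and the server layout are all assumptions made up for this example:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.nio.file.*;
    import java.util.*;

    // Sketch only: reads a (hypothetical) modules.properties manifest from
    // the server and downloads just the module archives that changed.
    public class ModuleUpdater {
        private final URL serverBase;  // e.g. http://server/applet/
        private final Path cacheDir;   // local cache of module jars

        public ModuleUpdater(URL serverBase, Path cacheDir) {
            this.serverBase = serverBase;
            this.cacheDir = cacheDir;
        }

        // Returns a classloader over all module jars; patch jars are placed
        // first so their classes shadow the same classes in older archives.
        public ClassLoader loadModules() throws IOException {
            Properties remote = new Properties();
            try (InputStream in = new URL(serverBase, "modules.properties").openStream()) {
                remote.load(in); // e.g. core=1.4, ui=2.1, patch=1.4.2
            }
            List<URL> jars = new ArrayList<>();
            for (String module : remote.stringPropertyNames()) {
                Path jar = cacheDir.resolve(module + "-" + remote.getProperty(module) + ".jar");
                if (!Files.exists(jar)) { // missing or stale: fetch just this module
                    Files.createDirectories(cacheDir);
                    try (InputStream in = new URL(serverBase, jar.getFileName().toString()).openStream()) {
                        Files.copy(in, jar, StandardCopyOption.REPLACE_EXISTING);
                    }
                }
                if (module.startsWith("patch")) jars.add(0, jar.toUri().toURL());
                else jars.add(jar.toUri().toURL());
            }
            return new URLClassLoader(jars.toArray(new URL[0]),
                                      ModuleUpdater.class.getClassLoader());
        }
    }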

If you can put a .htaccess file in the directory, you can add an ExpiresByType directive so that the client doesn't ask the server every time.
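For reference, a minimal .htaccess sketch of that idea (assuming mod_expires is enabled; the one-week lifetime and the JAR MIME type are illustrative choices):

    # Cache the applet archive for a week so browsers don't revalidate on every load
    ExpiresActive On
    ExpiresByType application/java-archive "access plus 7 days"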
Have you looked at pack200 (and pack200 combined with the .htaccess approach)?
Have you looked at an indexed JAR?
Anthony


What can a web page Java applet access on my computer?

So, how much trust do I need to have in a publisher before I run their applet in the web browser?
In other words, I understand that a java applet is run in a sandbox in the browser, but this article suggests that the applet can actually access files stored on the local computer.
Can you please clarify the security limits of a java applet run in a modern browser, such as Firefox 50?
I understand that a java applet is run in a sandbox in the browser, but this article suggests that the applet can actually access files stored on the local computer.
There are potentially three different levels of security available to a Java applet.
The first is, as you described, 'sandboxed'. Such applets can only access resources from their own server, and nothing on your local file system, unless they are launched using Java Web Start and thereby have access to the services of the JNLP API. You might note that two of those services are the FileOpenService / FileSaveService! If the applet tries to use these, the end user will be prompted to permit the action via a dialog that states what the applet is trying to do and asks for permission to proceed (to show a file chooser and go from there). These services hand back a 'file-like' object that is more limited than the normal File API would supply. For example, it will not provide the path to the resource, just its name and access to the content.
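To illustrate, a minimal sketch of the FileOpenService from a sandboxed app; the '.txt' filter is an arbitrary choice, and note that the returned FileContents exposes the name and content but not the path:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import javax.jnlp.FileContents;
    import javax.jnlp.FileOpenService;
    import javax.jnlp.ServiceManager;
    import javax.jnlp.UnavailableServiceException;

    // Sandboxed file access via the JNLP API: the user sees a permission
    // prompt plus a file chooser; the code gets a path-less FileContents.
    public class SandboxedOpen {
        public static void main(String[] args)
                throws UnavailableServiceException, IOException {
            FileOpenService fos =
                (FileOpenService) ServiceManager.lookup("javax.jnlp.FileOpenService");
            FileContents contents = fos.openFileDialog(null, new String[] { "txt" });
            if (contents != null) { // null if the user cancelled
                System.out.println("Name (no path!): " + contents.getName());
                try (BufferedReader r = new BufferedReader(
                        new InputStreamReader(contents.getInputStream()))) {
                    System.out.println("First line: " + r.readLine());
                }
            }
        }
    }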
The level up from that can be specified in the launch file: '(J2EE) application client permissions'. This level removes the prompts for use of the JNLP API services.
The highest level of access is obtained by requesting, and being granted, 'all permissions'. Then the applet should have full access to File objects, be able to communicate with servers other than the one that launched it, etc. One of the few things they would still not be permitted to do in this mode is to call System.exit(n) to effectively 'kill the JRE' - this is something that is commonly done in other desktop apps.
But then there are JRE bugs that screw all of that up. Sun, and later Oracle, kept stuffing up security so badly (and so regularly) that many browser manufacturers are removing support for applets (and other embedded objects requiring plug-ins) from web pages entirely.
See Java Plugin support deprecated and Moving to a Plugin-Free Web for more detail.
..how much trust do I need to have in a publisher before I run their applet in the web browser?
I cannot answer for you, but my take would be that I would need to know them personally, and trust completely both their integrity and competence before I'd run their code in any browser I controlled.
Having said that, I don't think I have a single browser installed that even supports applets, and my complete lack of motivation to set something up is probably a good indication of whether I'd allow applets to run on this PC at all.

How do I deploy and manage a small scale Java desktop application?

Some relevant background:
My application is a Java app compiled into a .exe via JSmooth. The anticipated user base would likely be a few hundred users, but could grow well beyond that, as it's a community specific application.
How it works:
2 .jar files: one that performs initial checks, another with the meat of the application.
Ideally, the init jar displays the splash screen and checks the version in desktop.txt against server.txt; if they differ, it prompts the user to update.
What I need to figure out:
1) What is a cheap, scalable hosting service that I could use as the file host for updates?
2) How can I create an "updater" to actually perform the jar replacement? My current solution is simply writing an updater in Java, but I was hoping for something like the installers people are more familiar with.
All of the research I've done has resulted in lackluster results, as 99% of hosting searches result in site hosting results. I just need an update repository with reasonable security. i.e., decent DDoS resistance and not left wide open to the Internet.
Easy to do and very cheap with Amazon S3 or Joyent Manta, as both support time-limited signed URLs and headers (which can contain a SHA-1 of the file) for checking whether an update is needed before downloading.
On startup your app would check the update URL to see if it has changed. If it has, download the JARs; do this before the app loads classes from those JARs. Updating the updater itself will be trickier, so consider that an update might need to ship a new update URL to prevent expiry.
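A rough sketch of that startup check, assuming an S3-style URL and using the ETag response header as the change marker (a custom SHA-1 header would work the same way); all names and paths here are illustrative:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.*;

    // Compare the remote ETag with the one recorded at the last update;
    // only download the JAR when they differ.
    public class UpdateCheck {
        public static void main(String[] args) throws IOException {
            URL updateUrl = new URL("https://example-bucket.s3.amazonaws.com/app/core.jar");
            Path etagFile = Paths.get(System.getProperty("user.home"), ".myapp", "core.etag");

            HttpURLConnection head = (HttpURLConnection) updateUrl.openConnection();
            head.setRequestMethod("HEAD");  // headers only, no download yet
            String remoteTag = head.getHeaderField("ETag");
            head.disconnect();

            String localTag = Files.exists(etagFile)
                    ? new String(Files.readAllBytes(etagFile)).trim() : "";

            if (remoteTag != null && !remoteTag.equals(localTag)) {
                // Download the new JAR *before* any of its classes are loaded.
                Path jar = etagFile.resolveSibling("core.jar");
                Files.createDirectories(jar.getParent());
                try (InputStream in = updateUrl.openStream()) {
                    Files.copy(in, jar, StandardCopyOption.REPLACE_EXISTING);
                }
                Files.write(etagFile, remoteTag.getBytes());
            }
        }
    }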

Best place for my Java WebStart application to store its files

I'm not really clear on how Java WebStart manages its files.
Should I trust that files saved from my code in the Web Start working directory "." will stick around?
I'm wondering if a good alternative might be to use a folder under the user's home directory, or perhaps I should let the user configure the data directory to allow for another partition (for backups or space). Having to do all of this seems to defeat the purpose of Java Web Start. At one point I even deleted the Web Start application from the Java Console, but I was still able to start the application with the network card disabled (offline).
I think I have covered every lesson here, but they just did not cover this at all.
The PersistenceService is part of the JNLP API. Here is a demo of the service, which can be used by sandboxed apps as well.
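A minimal sketch of using the PersistenceService from a sandboxed app; the 'settings' key and the 4KB allocation are illustrative values:

    import java.io.FileNotFoundException;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.net.URL;
    import javax.jnlp.BasicService;
    import javax.jnlp.FileContents;
    import javax.jnlp.PersistenceService;
    import javax.jnlp.ServiceManager;

    // Sandbox-safe storage: entries are keyed by URLs relative to the
    // app's codebase, and Web Start manages where they actually live.
    public class MuffinStore {
        public static void main(String[] args) throws Exception {
            BasicService basic =
                (BasicService) ServiceManager.lookup("javax.jnlp.BasicService");
            PersistenceService store =
                (PersistenceService) ServiceManager.lookup("javax.jnlp.PersistenceService");

            URL key = new URL(basic.getCodeBase(), "settings");
            FileContents fc;
            try {
                fc = store.get(key);      // existing entry?
            } catch (FileNotFoundException e) {
                store.create(key, 4096);  // allocate up to 4KB
                fc = store.get(key);
            }
            try (Writer w = new OutputStreamWriter(fc.getOutputStream(true))) {
                w.write("lastRun=" + System.currentTimeMillis());
            }
        }
    }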

Running a Java application in a network location

We have a VPN network, and at a central point we host a Java application (a .jar file). We are allowing users of the VPN to run this application. What are the cons of this solution?
As for pros -
Easy to update to a new version
Storing files at relative locations keeps them saved in a central location
EDIT
Is it possible to access the COM ports using JWS since our app runs inside a sandbox?
I think what you describe would work well with Java Web Start - advantages I can think of
reduced bandwidth usage (JWS only downloads files if they have been updated; otherwise it uses a locally cached copy).
possibility to use specific JVM parameters (see the launch-file sketch below).
automatic check of the client configuration (for example, the JRE version must be at least xxxx; if not, download it).
There are probably more.
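For instance, a minimal sketch of a JNLP launch file covering the cached-offline, JVM-parameter, and minimum-JRE points from the list above; the codebase, names, and versions are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <jnlp spec="1.0+" codebase="http://example.com/app" href="app.jnlp">
      <information>
        <title>My App</title>
        <vendor>Example Corp</vendor>
        <offline-allowed/>  <!-- run from the local cache when offline -->
      </information>
      <resources>
        <!-- minimum JRE version plus a specific JVM parameter -->
        <j2se version="1.6+" max-heap-size="256m"/>
        <jar href="app.jar" main="true"/>
      </resources>
      <application-desc main-class="com.example.Main"/>
    </jnlp>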
It's okay. For a customer project, we chose this solution too.
It's okay if you have good bandwidth and, for traffic-producing apps, low latency at runtime.
As a test, I built an SQL wrapper to add simulated latency to each call to our database, so we got a feel for the application at runtime without a real VPN connection.
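A bare-bones sketch of such a latency wrapper, assuming JDBC; only one method is shown, and a full wrapper would delegate the rest of the Connection interface the same way (or use a dynamic proxy):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    // Delegates to a real Connection but sleeps before each statement
    // to mimic the round-trip delay of a slow VPN link.
    public class LatencyConnection {
        private final Connection delegate;
        private final long latencyMillis;

        public LatencyConnection(Connection delegate, long latencyMillis) {
            this.delegate = delegate;
            this.latencyMillis = latencyMillis;
        }

        public PreparedStatement prepareStatement(String sql) throws SQLException {
            try {
                Thread.sleep(latencyMillis); // simulated network latency
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return delegate.prepareStatement(sql);
        }
    }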

Large File Download

Internet Explorer has a file download limit of 4GB (2GB on IE6). Firefox does not have this problem (I haven't tested Safari yet).
(More info here: http://support.microsoft.com/kb/298618)
I am working on a site that will allow the user to download very large files (up to and exceeding 100GB)
What is the best way to do this without using FTP? The end user must be able to download the file from their browser using HTTP. I don't think Flash or Silverlight can save files to the client, so as far as I know they won't cut it.
I'm guessing we will need an ActiveX or Java applet to pull this off. Something like the download manager that MSDN uses.
Does anyone know of a commercial (or free) component that will do that? We do not want the user to have to install a "browser wide" download manager (like GetRight), we want it to only work with downloading on our site.
Update: Here is some additional info to help clarify what I'm trying to do. Most of the files above the 4GB limit will be large HD video files (it's for a video editing company). These will be downloaded by users across the internet; this isn't going to be people on a local network. We want the files to be available via HTTP (some users are going to be behind firewalls that won't allow FTP, BitTorrent, etc.). There will be a library of files the end user could download, so we aren't talking about a one-time large download. They will be downloading different large files on a semi-regular basis.
So far Vault, which #Edmund-Tay suggested, is the closest solution. The only problem is that it doesn't work for files larger than 4GB (it instantly fails before starting the download; they are probably using a 32-bit integer somewhere that is overflowed by the content length of the file).
The best solution would be a Java applet or ActiveX component that works like the article #spoulson linked to, since the problem only exists in IE. However, so far I haven't had any luck finding a solution that does anything like that (multipart downloads, resume, etc.).
It looks like we might have to write our own. Another option would be to write a .NET application (maybe ClickOnce) that is associated with an extension or MIME type. Then the user would actually be downloading a small file from the web server that opens in the exe/ClickOnce app and tells the application what file to download. That is how the MSDN downloader works. The end user would then only have to download/install an EXE once, which is better than downloading an exe every time they want to download a large file.
#levand:
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
That's a dealbreaker for many, many sites. Users either are or should be extremely reluctant to download .exe files from websites and run them willy-nilly. Even if they're not always that cautious, incautious behaviour is not something we should encourage as responsible developers.
If you're working on something along the lines of a company intranet, a .exe is potentially an okay solution, but for the public web? No way.
#TonyB:
What is the best way to do this without using FTP.
I'm sorry, but I have to ask why the requirement. Your question reads to me along the lines of "what's the best way to cook a steak without any meat or heat source?" FTP was designed for this sort of thing.
bittorrent?
There have been a few web-based versions already (bitlet, w3btorrent), and Azureus was built using java, so it's definitely possible.
Edit: #TonyB is it limited to port 80?
Please don't use ActiveX... I am so sick of sites that are only viewable in IE.
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
Can you split the files into pieces and then rejoin them after the download?
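If splitting is an option, the rejoin step is trivial; a sketch assuming pieces named bigfile.mov.part0, bigfile.mov.part1, ... (the names are illustrative):

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.*;

    // Concatenate numbered part files back into the original file.
    public class JoinParts {
        public static void main(String[] args) throws IOException {
            Path out = Paths.get("bigfile.mov");
            try (OutputStream dest = Files.newOutputStream(out)) {
                for (int i = 0; ; i++) {
                    Path part = Paths.get("bigfile.mov.part" + i);
                    if (!Files.exists(part)) break;  // no more pieces
                    Files.copy(part, dest);          // append this piece
                }
            }
        }
    }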
If you don't want to write java code in-house, there are commercial applet solutions available:
Vault
MyDownloder
Both of them have eval versions that you can download and test.
A few ideas:
Blizzard use a light-weight .exe BitTorrent wrapper for their patches. I'm not entirely sure how it is done, but it looks like a branded version of the official BitTorrent client.
Upload to Amazon S3, provide the torrent link of the file (all S3 files are automatically BitTorrent-enabled), plus the full HTTP download link as alternative. See S3 documentation
What about saying "We recommend that you install Free Download Manager to download this file. You will have the added benefit of being able to resume the file and accelerate the download."
Personally I never download anything using the built in browser download tool unless I have to (e.g. Gmail attachments)
#travis
Unfortunately it has to be over HTTP inside the user's browser.
I'll update the question to be more clear about that.
#levand
The problem only exists in IE (it works in Firefox), so while ActiveX would only work on IE, IE is the only browser we need the workaround for.
#travis - interesting idea. Not sure if it will work for what I need but I'll keep it in mind. I'm hoping to find something to integrate with the existing site instead of having to go out to a third party. It would also require me to setup a bittorrent tracker which wouldn't be as easy as it sounds for this application because different users will have different access to different files.
#jjnguy
I'm looking for a java applet or ActiveX component that will do that for me. These are non-technical users, so we really just want them to click download and have the full file end up in the specified location.
#ceejayoz
I totally agree, but it's part of the requirement from our client. There will be FTP access, but each user will have the option of downloading via HTTP or FTP. There are some users that will be behind corporate firewalls that don't permit FTP.
I have seen other sites do this in the past (MSDN, Adobe), so I was hoping there is something out there already instead of having to make one in-house (and learn Java and/or ActiveX).
I say click-once installed download manager, similar to msdn.
But becoming a CDN without a more optimized protocol for the job is no easy task. I can't imagine a business model that can be worthwhile enough to have such large file downloads as a core competency unless you are doing something like msdn. If you create a thick client, you at least get the chance to get some more face time with the users, for advertising or some other revenue model, since you will probably be paying in the hundreds of thousands of dollars to host such a service.
The problem with the applet approach mentioned is that unless you have the end user modify their Java security properties, your applet will not have permission to save to the hard drive.
It may be possible using Java Web Start (aka JNLP). I think that if it is a signed app it can get the extra permission to write to the hard drive. This is not too different from the download an exe approach. The problem with this is that the user has to have the correct version of Java installed and has to have the Java Web Start capability setup correctly.
I would recommend the exe approach as it will be the easiest for the non-technical users to use.
There are some users that will be behind corporate firewalls that don't permit FTP...
Are users with restrictive firewalls like that likely to be permitted to install and run a .exe file from your website?
Take a look at cURL. This article describes how to do a multi-part simultaneous download via HTTP. I've used cURL in the past to manage FTP downloads of files over 300GB.
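The same multi-part trick works from Java via HTTP Range requests; a sketch of one chunk worker (the URL, chunk size, and file name are illustrative, and a real downloader would run several of these in parallel):

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.RandomAccessFile;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Fetch one byte range of a large file and write it at its offset.
    public class RangeChunk {
        public static void main(String[] args) throws IOException {
            URL url = new URL("http://example.com/huge-video.mov");
            long from = 0, to = 9_999_999;  // first ~10MB chunk

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("Range", "bytes=" + from + "-" + to);

            try (InputStream in = conn.getInputStream();
                 RandomAccessFile out = new RandomAccessFile("huge-video.mov", "rw")) {
                out.seek(from);  // write this chunk at its own offset
                byte[] buf = new byte[64 * 1024];
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                }
            }
        }
    }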
Another tip: You can boost download times even more if you increase the size of the TCP Window on the client's NIC configuration. Set it as high as the OS allows and you should see up to 2x improvement depending on your physical network. This worked for me on Windows 2000 and 2003 when FTPing over a WAN. The down side is it may increase overhead for all other network traffic that wants only a few KB for a network packet, but is now forced to send/recv in 64KB packets. Your mileage may vary.
Edit: What exactly is this you're trying to accomplish? Who is the audience? I'm assumed for a bit that you're looking to do this over your own network; but you seem to imply the client side is someone on the internet. I think we need clearer requirements.
Create a folder for the files to be downloaded on the server where the document service is running (either using Linux commands or by executing shell commands from Java).
Write the file to be downloaded into this folder (a Linux command or a Java-executed shell command is fine). For efficiency of program execution, the wget command is used here.
Package the download folder as a zip file (using a shell command), configure an nginx proxy, return nginx's access path for the file to the front end, and then download it from the front end.
