We will have large files (up to 2 GB) on a web page, and we want users to be able to resume a download if it gets interrupted.
At the moment, the only solution I can come up with is a Java applet. I have tried searching for existing open-source projects with this functionality but haven't found any so far.
I would be thankful for any tips on how to achieve this, or pointers to existing projects or documentation that could be useful.
I am open to any solution, so it does not have to be a Java applet (the important thing is that it works in the most common browsers).
You could serve it as a torrent instead of a simple file and let the user's BitTorrent client figure it out.
Assuming you don't want to do that, HTTP lets the client request a specific range of bytes of a file. The user's browser has to recognize that the file the user wants to download is the same as the one that already exists in partial form, and send the proper Range headers to the server; the server, in turn, has to honor them. You can probably take care of the server side, but the browser will have to hold up its end. Some "download manager" programs do this too.
For more information:
http://www.west-wind.com/Weblog/posts/244.aspx
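To make the mechanism concrete, here is a minimal sketch of resuming a download with `HttpURLConnection` and a `Range` header. It assumes the server honors byte ranges (answering 206 Partial Content); the URL and file names are placeholders, not part of any existing project.

```java
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.RandomAccessFile;
import java.net.HttpURLConnection;
import java.net.URL;

public class ResumeDownload {

    // Build the Range header value for resuming at a given byte offset,
    // e.g. "bytes=1024-" to ask for everything from byte 1024 onward.
    static String rangeHeader(long existingBytes) {
        return "bytes=" + existingBytes + "-";
    }

    static void resume(String fileUrl, File partial) throws IOException {
        long offset = partial.exists() ? partial.length() : 0;
        HttpURLConnection conn =
                (HttpURLConnection) new URL(fileUrl).openConnection();
        conn.setRequestProperty("Range", rangeHeader(offset));
        // 206 Partial Content means the server honored the range;
        // a plain 200 means we must start over from byte zero.
        boolean resumed =
                conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL;
        try (InputStream in = conn.getInputStream();
             RandomAccessFile out = new RandomAccessFile(partial, "rw")) {
            out.seek(resumed ? offset : 0);
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}
```

The seek-or-restart check matters: a server that ignores `Range` still replies 200 with the whole file, and appending that to an existing partial file would corrupt it.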
What you need is a Java download manager.
1. An open-source project exists on Google Code:
http://code.google.com/p/jdownloadmon/
See HTTPDownloadConnection.java, method connect.
2. Check what partial-content range types the server supports:
Accept-Ranges: bytes
http://en.wikipedia.org/wiki/List_of_HTTP_header_field
The Accept-Ranges header is what allows the user to pause and resume a download.
3. http://stackoverflow.com/questions/3414438/java-resume-download-in-urlconnection
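A quick way to check point 2 programmatically is a HEAD request: if the server answers with Accept-Ranges: bytes, resuming should work. A small sketch (the URL is whatever file you intend to download):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeSupport {

    // A server advertises resume support with "Accept-Ranges: bytes".
    // "none" or a missing header means range requests may be ignored.
    static boolean supportsByteRanges(String acceptRanges) {
        return "bytes".equalsIgnoreCase(acceptRanges);
    }

    static boolean check(String fileUrl) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(fileUrl).openConnection();
        conn.setRequestMethod("HEAD"); // headers only, no body transfer
        return supportsByteRanges(conn.getHeaderField("Accept-Ranges"));
    }
}
```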
Related
I have a Java web app that deploys as a WAR to Tomcat 7.0.41 (myapp.war). I noticed that when I deploy the WAR to a Tomcat that lives in one part of our network, the web pages display perfectly fine. However, and this only happens in IE 11, if I take the exact same WAR and deploy it to the exact same version/Chef-configured-instance of a Tomcat server that lives in another part of our network, the page stylings look way different and completely wrong. Again, this is specific to IE11 and the location in the network that the app is served from. If I go to the app in IE 11 from a "good" location on the network, the frontend renders perfectly fine. Or if I view the app from a "bad" location on the network, but in a non-IE browser, again all is well.
I have a feeling that we might have some IT proxy (nginx, etc.) that is preventing Tomcat from serving certain CSS/JS files, and so the end result is a partially-complete frontend that looks all wonky in the browser. And somehow, this only crops up in IE 11.
I have (sort of) confirmed this by viewing the source of all my HTML, JS and CSS files and copying them to files in a local folder. I then open up one of the HTML files (locally) in a browser and the site displays perfectly.
The problem here is that my JS files use a bunch of open source JS libraries. And those libraries have dependencies on other libraries. So on and so forth, and the dependency graph is pretty huge. It's tough for me to tell which files are not being downloaded properly/completely.
Here's the kicker: if I add in html5shiv to my app then the problem goes away entirely, no matter which browser (IE or not) or what location in the network I choose. However adding html5shiv breaks other things in my app, and for reasons outside the context of this question, can't be used.
Anyone have any idea how I could troubleshoot/fix this? Why would this only be affecting IE 11 and not other browsers? Why is html5shiv solving this?!?
You need to start using Wireshark.
What it does is capture all network traffic and allow you to view it exactly as it was sent/received by your network card.
What I would do is capture the complete traffic that occurs between your computer and the server in the location where it is working, when you visit the webpage that has the problem. Then repeat that for the server that is not working.
You will then have the complete traffic and can compare them side by side. Even if it doesn't tell you the cause of the problem Wireshark will tell you where the difference is occurring in the packets that are sent by the two different servers.
You could also do it the other way round by running tcpdump on the server (with a command like tcpdump -i eth0 -w file.cap -s 0, so that you capture complete packets rather than just the first X bytes) to capture the packets sent, and then viewing the capture in Wireshark.
"Does Wireshark offer such file-level abstractions or is it all nitty-gritty, byte-level output I need to read?"
Kind of both. Basically, once you have the stream in front of you, you can see where the individual requests start by looking for GET entries in the packets.
Once you've identified where a file starts, you can right-click on that packet and choose "Follow TCP Stream", and Wireshark will give you a summarised view of that TCP stream.
If you need the detailed difference between the files, it will be there... but to be honest, it's probably going to be something obvious, like a file being completely truncated or mangled, rather than just a byte or two being wrong in one of the files.
I need to somehow load the HTML code of a webpage A into a JavaScript string of another webpage B, on a different host. I know this is impossible with JavaScript alone because of the same-origin policy, and I know I could load the page via PHP on my server and then send the result back to the user's client, but I wouldn't be able to handle that many requests, so it needs to be done directly by the user's browser. I can use nearly any browser scripting language or applet framework common enough to be installed on the majority of my users' computers, like Flash and Java.
For example, what if I use Flash or Java to load the external HTML code and then call a JavaScript callback function providing the source? Could this work?
Do you have ANY idea? I gratefully accept any suggestion, and I REALLY appreciate examples!
Thank you very much!
Matteo
It would require a digitally signed and trusted applet in order to reach cross-domain, unless the user is running a Plug-In 2 architecture JRE and the site implements a cross-domain policy file (crossdomain.xml).
Ordinarily, unsigned Java applets or applications may only connect
back to the web server from which they originated. This restriction
prevents an applet from causing denial-of-service attacks by making
rapid connections to arbitrary machines on the Internet.
In Java SE 6 update 10, both the Java Web Start and Java Plug-In
technologies contain preliminary support for cross-domain policy
files, which specify how unsigned code may access web services on the
Internet. The crossdomain.xml policy file is hosted on a given server
and allows either selected clients, or clients from anywhere, to
connect to that server. Cross-domain policy files make accessing web
services much easier, particularly from unsigned applets.
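For the unsigned-applet case, the policy file is just a small XML document served from the target host. A minimal (and deliberately wide-open) sketch, following the crossdomain.xml format that the plug-in's preliminary support is based on:

```xml
<?xml version="1.0"?>
<cross-domain-policy>
    <!-- Allows clients from any origin to connect; in production you
         would restrict the domain attribute to specific hosts. -->
    <allow-access-from domain="*"/>
</cross-domain-policy>
```

Note that this must be deployed by the owner of webpage A's host; if you don't control that server, you are back to a signed applet.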
"via php on my server and then send results back to the user's client but I wouldn't be able to handle so many requests"
So many requests? That is not really so many; it is just a PHP script reading a couple of pages and creating a new page from the data. If that really is too much for your server (hard to believe), you can certainly do that kind of thing with Flash (on the client's computer): load the two pages, parse the data into one HTML page, and display it (via JS) in the client's browser. Kind of a weird question, after all... perhaps I did not understand it :)
See, I am developing a web application that downloads files from a server via HTTP requests, but in one case the file isn't on the server but in the applet itself; that is, some binary content is dynamically generated inside the applet, and it must be downloaded. Of course I can use the Java libraries to save the file to the client file system (if the applet is a signed one), but I was wondering if it can be done by connecting the Java OutputStream to the browser's download window: in other words, starting a download from an applet.
Am I a crazy person?
By the way, is it possible to do something similar from JavaScript?
No, it is not possible to get around security by attaching the output of an applet to the standard file download mechanisms of a browser.
OTOH, since the Next Generation Java Plug-In, it is no longer necessary to have a signed and trusted applet in order to save files (or bytes) to the local file system. Chase the links on the applet info page for more details. For a demonstration of the JNLP API services (which Plug-In 2 hooks into for this functionality), see the File service demo.
You can if, e.g., you upload the file to the server and then force the browser (via LiveConnect or otherwise) to open that file from the server.
As far as I'm aware, there's no cross-browser way to emulate downloading from within an applet. So you should create that download yourself, and let browser do what it does best.
Obviously, it might well make sense to move the actual creation of the stream to your server side.
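A sketch of that "upload, then let the browser download" approach follows. Inside a real applet, you would finish with getAppletContext().showDocument(url) so the browser's own download dialog takes over; the /upload and /download endpoints and the token parameter are assumptions about your server, not an existing API.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class PushThenDownload {

    // Build the URL the browser will be pointed at; the "token" query
    // parameter identifying the uploaded bytes is hypothetical.
    static String downloadUrl(String base, String token) {
        return base + "/download?token=" + token;
    }

    static URL upload(byte[] generated, String base, String token)
            throws Exception {
        // 1. POST the applet-generated bytes back to the originating
        //    server (same-origin, so no signing is needed for this part).
        HttpURLConnection conn = (HttpURLConnection)
                new URL(base + "/upload?token=" + token).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(generated);
        }
        conn.getResponseCode(); // force the request to complete

        // 2. The applet would now call showDocument on this URL, and the
        //    browser's standard download mechanism handles the rest.
        return new URL(downloadUrl(base, token));
    }
}
```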
One of my requirements is that, on page load, a file is created dynamically and downloaded to a particular location on the client's machine.
In case the file is already present, it has to be overwritten.
Is there any way we can access the client's system and store the file in the required folder?
I feel one cannot access the client machine when the code is being executed on the server.
Scenario:
1. The user clicks on "generate document"; the applet takes the template stream data and the required data file, then saves the two files onto the client machine.
2. After that, the template opens and fetches the data file from the same directory.
Please help me on this. This is an SOS!!
There are probably other solutions, I use a signed applet for this purpose.
As always, there are a few caveats though:
You can't "force" anything against the will of the user. Applets may be disabled in the client's browser, or they may not even have Java installed. Or the target directory might not be writeable by the user. Your server should gracefully handle cases where the client doesn't have the correct version of the file.
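The file-writing step itself, once the user has accepted the signed applet's certificate, is plain Java I/O. A minimal sketch (the target directory and file name are placeholders); the CREATE plus TRUNCATE_EXISTING combination gives exactly the "overwrite if already present" behavior asked for:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ClientFileWriter {

    // Write (or overwrite) the generated document at a fixed location.
    static Path save(Path dir, String name, byte[] content)
            throws IOException {
        Files.createDirectories(dir); // fails if dir is not writeable
        Path target = dir.resolve(name);
        Files.write(target, content,
                StandardOpenOption.CREATE,
                StandardOpenOption.TRUNCATE_EXISTING,
                StandardOpenOption.WRITE);
        return target;
    }
}
```

Wrap the call in a try/catch and report failures back to your server, per the caveat above about unwriteable directories.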
You obviously can't do this from the server side, but you really can't do this from a client script either. Browser security will prevent a page script from modifying the contents of the file system.
Your only option is to run third-party browser plugin software that has elevated permissions.
Examples of such are
Java Applets
Java WebStart
Microsoft Silverlight
ActiveX
Each one is different and most require some level of user interaction to confirm that they allow plugins to run with elevated security.
We have a web application that allows the user to download a zip file from a web server. We just point a dummy iframe's source at the full URL of the zip file on the web server. This approach lets the end user use the browser controls to open or save the zip to their local machine.
We have a requirement that the zip file is automatically extracted and save to a specific location on user's machine. Any thoughts on how this can be achieved?
Thanks.
I highly doubt that you'll be able to do that. The closest you're likely to get is to generate a self-extracting executable file (which would be OS-dependent, of course).
I certainly wouldn't want a zip file to be automatically extracted - and I wouldn't want my browser to be able to force that decision upon me.
Short answer is I don't believe this is possible using the simple URL link you've implemented.
Fundamentally the problem you have is that you have no control over what the user does on their end, since you've ceded control to the browser.
If you do want to do this, then you'll need some client-side code that downloads the zipfile and unzips it.
I suspect Java is the way to go for this; JavaScript and Flash both have problems writing files to the local drive. Of course, if you want to be Windows-only, then a COM object could work.
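If you go the Java route, the unzip step is covered by java.util.zip. A sketch of client-side extraction (for example inside a signed applet that has already downloaded the archive; the destination directory is a placeholder), including a guard against entries that try to escape the target directory:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ClientUnzip {

    static void extract(InputStream zipStream, Path destDir)
            throws IOException {
        try (ZipInputStream zip = new ZipInputStream(zipStream)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                Path target = destDir.resolve(entry.getName()).normalize();
                // Refuse entries like "../../evil" ("zip slip").
                if (!target.startsWith(destDir.normalize())) {
                    throw new IOException("Bad zip entry: " + entry.getName());
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(zip, target); // reads up to the entry's end
                }
                zip.closeEntry();
            }
        }
    }
}
```

The same caveats as above apply: this needs elevated permissions and user consent; nothing here makes the extraction silent or automatic against the user's will.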
Instead of sending a zip file, why don't you instruct the web server to compress all the web traffic and just send the files directly?
See http://articles.sitepoint.com/article/web-output-mod_gzip-apache# for example.
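As a sketch, with Apache and the newer mod_deflate module (assuming it is enabled; it superseded mod_gzip in Apache 2.x), on-the-fly compression is a single directive:

```apache
# Compress common text responses on the fly (requires mod_deflate).
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```

The browser transparently decompresses the response, so nothing ends up as a zip on the user's disk at all.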