I have a form with SWFUpload.
Files uploaded to the server are converted server-side (video is compressed, images are resized, etc.).
The question is: can I delegate some of that functionality to the client side (such as image resizing) to save bandwidth for the user?
Video compression via JavaScript would probably slow the browser down so much that the bandwidth savings wouldn't be worth it, and if anything it would probably annoy the end user.
Our site lets users upload pictures. A user can upload absolutely any file (we assume for the moment that it will be a picture). We show a lot of pictures per page, so we have huge traffic, and we want to compress the pictures on the server. I have found the following article about picture compression:
How to compress jpg file?
But it only explains how to compress one specific format. Is there a universal way to compress any picture?
"Picture" is a very broad term that can also include vector graphics and images in formats browsers usually can't display (like PSD files).
I would recommend using the GD library or ImageMagick on the server side to convert them all (or at least the formats supported by your library of choice) to a standardized format (like JPEG) that can be displayed in a web browser, compressing (and maybe also resizing) them in the process.
As far as I know, not 100% of the image files a user could upload can be converted.
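If you happen to be on the JVM rather than PHP, the same idea can be sketched with the standard javax.imageio classes instead of GD/ImageMagick; they cover common raster formats (JPEG, PNG, GIF, BMP) but not PSD or vector formats, which is why the null check matters. Class and method names here are mine:

    import javax.imageio.ImageIO;
    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;

    public class ImageNormalizer {
        // Converts any format ImageIO can read to JPEG.
        // Returns false when the file is not a readable raster image.
        public static boolean toJpeg(File source, File target) throws IOException {
            BufferedImage input = ImageIO.read(source);
            if (input == null) {
                return false; // unsupported or corrupt file
            }
            // JPEG has no alpha channel, so flatten onto a white background.
            BufferedImage rgb = new BufferedImage(
                    input.getWidth(), input.getHeight(), BufferedImage.TYPE_INT_RGB);
            Graphics2D g = rgb.createGraphics();
            g.drawImage(input, 0, 0, Color.WHITE, null);
            g.dispose();
            return ImageIO.write(rgb, "jpg", target);
        }
    }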
I'm developing a download-anything app for Android and it works fine in most cases.
I have come across sites whose URLs end in what seems to be a long hash signature. The standard video app for Android, and my web browser, are able to play such links directly, streaming.
I have no clue how to stream this to a file (progressive download?), which should be possible. The URL parameter after '?' is used for something. As Jessica pointed out, URLs like the one below are probably used for RTMP streams (rtmp://...).
URL example (host domain edited out):
http://blush.im.54ca3830.919727.x.yesitisporn.com/videos/3gp/d/b/f/filthysite.com_dbf7f0a9c3913d4d0e09a36fe8ab3aba.mp4?e=1348368010&ri=1024&rs=85&h=c81c6707b13714ac65b651ba2939d94a
In the URL above there is a link to an mp4 video file. Trying to download it with this shorter URL does not work: http://blush.im.54ca3830.919727.x.yesitisporn.com/videos/3gp/d/b/f/filthysite.com_dbf7f0a9c3913d4d0e09a36fe8ab3aba.mp4. It returns an empty document.
Since popular video apps and browsers handle these types of HTTP links just fine for playback, there should be a standard way of getting the byte stream and writing it to a file. Thanks for any help!
In response to the question as originally posed:
It is quite common to add URL parameters, splitting the URL from the parameters with a question mark and separating the parameters with ampersands. Take the substring of everything up to the first non-escaped question mark if one is present; otherwise use the entire string.
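A minimal Java sketch of that substring logic (the class and method names are mine):

    public class UrlUtil {
        // Strip the query string: everything from the first '?' onward.
        // A literal '?' elsewhere in a URL would be percent-encoded (%3F),
        // so searching for the raw character is sufficient.
        public static String stripQuery(String url) {
            int q = url.indexOf('?');
            return (q >= 0) ? url.substring(0, q) : url;
        }
    }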
Based on new feedback:
Like I said in my comment, and as confirmed by your tests without the parameters, I think you're barking up the wrong tree by trying to change the URL. I suspect the reason you can't save these specific streams is that something about the file format or server configuration differs from the ones that work. In particular, my first thought is that those URLs are served by a streaming server (example: Icecast), not a normal file-based HTTP server.
Advantages of a streaming server include being able to serve different-bandwidth versions of a stream on the fly, instant seeking to any part of the file, and so forth. The downside for people trying to build download-anything applications is that those servers don't send the data as a single file; they send it in chunks. Without getting too technical: a chunk might contain the first frame plus a bunch of diffs describing the video and audio over the next several frames, repeated. As the server does this, it can throttle the quality it sends depending on the latency it sees or the resolution of your screen, or resize what it sends if you resize the window. This sort of streaming works particularly well for live events, but it has advantages for recorded events as well, particularly random seeking.
To complicate the matter of capturing the data, some streaming servers transmit the video data via the RTMP, RTSP, or MMS protocols instead of over HTTP. HTTP pseudo-streaming or a straight HTTP download is a lot easier to save than an RTMP stream. With some streaming types you pretty much have to recreate the file from the individual packets, or transcode it from what plays on the screen in real time. So you may need to spend some time learning about the different streaming protocols to figure out the best way to save the specific stream you're looking at.
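That said, for the plain-HTTP case (which your example URL with its full query string may well be), saving the stream is just a matter of opening the connection and copying bytes to a file. A minimal Java sketch, assuming the token parameters (e=, h=, ...) have not expired; the User-Agent header is a guess at what some hosts require:

    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class HttpDownload {
        // Saves a plain-HTTP (pseudo-)streamed file to disk. Keep the full
        // query string: the parameters are often an expiring access token.
        public static void saveToFile(String url, String path) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("User-Agent", "Mozilla/5.0"); // some hosts reject unknown clients
            try (InputStream in = conn.getInputStream();
                 OutputStream out = new FileOutputStream(path)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            } finally {
                conn.disconnect();
            }
        }
    }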
I have a problem I've been dealing with lately. My application asks its users to upload videos, to be shared with a private community. They are teaching videos, which are not always optimized for web quality to start with. The problem is, many of the videos are huge, way over the 50 megs I've seen in another question. In one case, a video was over a gig, and the only solution I had was to take the client's video from box.net, upload it to the video server via FTP, then associate it with the client's account by updating the database manually. Obviously, we don't want to deal with videos this way, we need it to all be handled automatically.
I've considered using either the box.net or dropbox API to facilitate large uploads, but would rather not go that way if I don't have to. We're using PHP for the main logic of the site, though I'm comfortable with many other languages, especially Python, but including Java, C++, or Perl. If I have to dedicate a whole server or server instance to handling the uploads, I will.
I'd rather do the client-side using native browser JavaScript, instead of Flash or other proprietary tech.
What is the final answer to uploading huge files through the web, handling the server side in PHP or any other language?
It is possible to raise the limits in Apache and PHP to handle files of this size. The basic HTTP upload mechanism does not offer progress information, however, so I would usually consider it acceptable only for LAN-type connections.
The usual alternative is a Flash or JavaScript uploader widget. These have the bonus that they can display progress information and integrate well with a PHP-based website.
For PHP, see http://php.net/manual/en/features.file-upload.php and note the ini changes described in the first comment.
Edit: this assumes you are running into timeout issues.
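For reference, these are the php.ini directives involved; the values below are only illustrative and should be tuned to your workload:

    ; Illustrative values for very large uploads; not a recommendation.
    upload_max_filesize = 2G
    post_max_size = 2G          ; must be >= upload_max_filesize
    max_input_time = 3600       ; seconds the script may spend reading input
    max_execution_time = 3600
    memory_limit = 512M         ; must exceed any in-memory processing you do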
I am using Google App Engine for my development. My project involves around 60 PDFs that should be available for users to download.
When I try to upload the project by clicking the deploy button in Eclipse, I get the error "app limit exceeded".
I just want to know: if I switch to a paid account, is the application size limit any different?
As far as I know it's 150 MB for now.
You should use the Blobstore service to store your PDF files, and keep the application itself only for the files needed by your application logic and presentation, not data. Here is a description of the Blobstore:
The Blobstore API allows your app to serve data objects, called blobs, that are much larger than the size allowed for objects in the Datastore service. Blobs are created by uploading a file through an HTTP request. Typically, your apps will do this by presenting a form with a file upload field to the user. When the form is submitted, the Blobstore creates a blob from the file's contents and returns an opaque reference to the blob, called a blob key, which you can later use to serve the blob.
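For illustration, the usual upload flow in Java looks roughly like this; the servlet path and the "pdf" field name are placeholders:

    import com.google.appengine.api.blobstore.BlobKey;
    import com.google.appengine.api.blobstore.BlobstoreService;
    import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
    import java.io.IOException;
    import java.util.List;
    import java.util.Map;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class PdfUploadServlet extends HttpServlet {
        private final BlobstoreService blobstore =
                BlobstoreServiceFactory.getBlobstoreService();

        // GET: render a form that posts the file directly to the Blobstore.
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String uploadUrl = blobstore.createUploadUrl("/pdfs"); // callback path is a placeholder
            resp.setContentType("text/html");
            resp.getWriter().println(
                "<form action='" + uploadUrl + "' method='post' enctype='multipart/form-data'>"
              + "<input type='file' name='pdf'><input type='submit'></form>");
        }

        // POST: the Blobstore has already stored the file; record the blob key.
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            Map<String, List<BlobKey>> uploads = blobstore.getUploads(req);
            BlobKey key = uploads.get("pdf").get(0);
            // Later, serve the PDF straight from the Blobstore:
            // blobstore.serve(key, resp);
            resp.sendRedirect("/pdfs?key=" + key.getKeyString());
        }
    }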
All good advice above; try to avoid putting content like that in your code. My app hit this issue with only about 10 MB of code, images, and resources; what takes up a lot of space is the GWT compilation of 15 permutations of the app.
One thing that helped me was changing the GWT JavaScript output style from Detailed to Obfuscated, which results in much smaller code. You can also limit the number of permutations being created.
https://developers.google.com/web-toolkit/doc/1.6/FAQ_DebuggingAndCompiling#Can_I_speed_up_the_GWT_compiler?
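For example, restricting the user.agent property in your module's .gwt.xml compiles far fewer permutations (the value below is illustrative; list the browsers you actually target). The output style is set with the compiler's -style flag (OBF, PRETTY, or DETAILED):

    <!-- Illustrative .gwt.xml fragment: compile a single browser
         permutation (here, recent Firefox) instead of all of them. -->
    <set-property name="user.agent" value="gecko1_8" />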
According to http://code.google.com/intl/de/appengine/docs/quotas.html#Deployments, applications may not exceed 10 MB.
You can upload up to 10 MB of data to your App Engine app; see the following link:
http://code.google.com/appengine/docs/quotas.html
I'm designing a server-side application that takes an image from a user, processes it, and sends it back over the network. Since the network connection might be quite slow, I'd like to speed things up by starting to process parts of the image while it is still being sent over the network and send parts of the processed image back to the client while other parts are still being processed.
Is this possible, preferably using the javax.imageio classes?
EDIT: I am mostly interested in writing PNG files. Wikipedia says: "IDAT contains the image, which may be split among multiple IDAT chunks. Doing so increases filesize slightly, but makes it possible to generate a PNG in a streaming manner."
This depends strongly on the encoding of the image. Some image formats require the whole file to be available before you can decode it; others, like GIF and some PNG encodings (as far as I remember), decode to individual blocks which can then be processed.
You will most likely need to write custom decoders, which may be quite a bit of work if you are not intimately familiar with the formats, and you need to support several.
Perhaps you should work on an upload progress bar instead?
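On the read side, javax.imageio does offer a hook for this: an IIOReadUpdateListener is called as regions of pixels become available during decoding, so you can start processing before the stream finishes (whether updates actually arrive incrementally depends on the format and the installed reader). A minimal sketch; the processing in imageUpdate is left as a stub:

    import java.awt.image.BufferedImage;
    import java.io.InputStream;
    import java.util.Iterator;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageReader;
    import javax.imageio.event.IIOReadUpdateListener;
    import javax.imageio.stream.ImageInputStream;

    public class StreamingDecode {
        // Decodes from a (possibly slow) network stream and receives a
        // callback each time a new region of pixels has been decoded.
        public static BufferedImage readIncrementally(InputStream network) throws Exception {
            ImageInputStream iis = ImageIO.createImageInputStream(network);
            Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
            if (!readers.hasNext()) {
                throw new IllegalArgumentException("unknown image format");
            }
            ImageReader reader = readers.next();
            reader.setInput(iis);
            reader.addIIOReadUpdateListener(new IIOReadUpdateListener() {
                public void imageUpdate(ImageReader source, BufferedImage theImage,
                        int minX, int minY, int width, int height,
                        int periodX, int periodY, int[] bands) {
                    // The region [minX, minY, width, height] is now decoded:
                    // process it here while the rest is still downloading.
                }
                public void passStarted(ImageReader source, BufferedImage theImage,
                        int pass, int minPass, int maxPass, int minX, int minY,
                        int periodX, int periodY, int[] bands) {}
                public void passComplete(ImageReader source, BufferedImage theImage) {}
                public void thumbnailUpdate(ImageReader source, BufferedImage theThumbnail,
                        int minX, int minY, int width, int height,
                        int periodX, int periodY, int[] bands) {}
                public void thumbnailPassStarted(ImageReader source, BufferedImage theThumbnail,
                        int pass, int minPass, int maxPass, int minX, int minY,
                        int periodX, int periodY, int[] bands) {}
                public void thumbnailPassComplete(ImageReader source, BufferedImage theThumbnail) {}
            });
            return reader.read(0);
        }
    }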