Request Method GET returns wrong content length - java

Hi, I am using an HttpURLConnection to fetch a text file's contents, and I want to know the size of that file. I use the getContentLength() method, but it returns the wrong value: for example, in this code the file's size is 17509 bytes, but it returns 5147.
Any help?
Thanks so much in advance :)
new Thread() {
    @Override
    public void run() {
        String path = parser.getValue(e, "txt");
        URL u = null;
        try {
            u = new URL(path);
            HttpURLConnection c = (HttpURLConnection) u.openConnection();
            c.setRequestMethod("GET");
            c.connect();
            int lengthOfFile = c.getContentLength();
            InputStream in = c.getInputStream();
            final ByteArrayOutputStream bo = new ByteArrayOutputStream();
            byte[] buffer = new byte[1024];
            int count;
            long total = 0;
            Log.i("p1", "" + lengthOfFile);
            while ((count = in.read(buffer)) != -1) {
                total += count;
                Log.i("p2", "" + total);
                bo.write(buffer, 0, count);
            }
            bo.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (ProtocolException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}.start();

The content-length is a header set by the server. I would check to make sure that your server is returning the correct Content-Length. You can do that with curl:
curl -v http://path/to/file.txt
That should show you the headers that were sent and returned.
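If curl isn't handy, you can do roughly the same check from Java. Here is a minimal sketch (the URL is the placeholder from the curl example) that issues a HEAD request and dumps the response headers; if you see a Content-Encoding such as gzip, the Content-Length refers to the compressed bytes on the wire, which would explain a smaller-than-expected value:
// assumes java.net.HttpURLConnection, java.net.URL, java.util.List, java.util.Map
URL u = new URL("http://path/to/file.txt"); // placeholder URL
HttpURLConnection c = (HttpURLConnection) u.openConnection();
c.setRequestMethod("HEAD"); // headers only, no body
for (Map.Entry<String, List<String>> h : c.getHeaderFields().entrySet()) {
    System.out.println(h.getKey() + ": " + h.getValue()); // e.g. Content-Length, Content-Encoding
}
c.disconnect();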

A quick workaround I can think of is to just ignore the content-length and read the input stream until there's nothing left to read:
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream(8192);
int read = inputStream.read();
while (read != -1) {
    byteArrayOutputStream.write((byte) read);
    read = inputStream.read();
}
byteArrayOutputStream.flush();
byte[] buf = byteArrayOutputStream.toByteArray();
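Reading one byte at a time works, but it is slow for anything big. A buffered variant of the same idea, sketched under the assumption that inputStream is the connection's input stream from the question:
byte[] chunk = new byte[8192];
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream(8192);
int n;
while ((n = inputStream.read(chunk)) != -1) {
    // write only the n bytes actually read in this pass
    byteArrayOutputStream.write(chunk, 0, n);
}
byte[] buf = byteArrayOutputStream.toByteArray();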

Related

How to download a remote file using Java

I'm trying to download a single file from a web server (http or https) using as few third party libraries as possible.
The method I've come up with is as follows:
private static final int BUFFER_SIZE = 8;

public static boolean download(URL url, File f) throws IOException {
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    FileOutputStream out = new FileOutputStream(f);
    BufferedInputStream in = new BufferedInputStream(conn.getInputStream());
    byte[] buffer;
    long dld = 0, expected = conn.getContentLengthLong(); // TODO expected will be -1 if the content length is unknown
    while (true) { // TODO fix endless loop if server timeout
        buffer = new byte[BUFFER_SIZE];
        int n = in.read(buffer);
        if (n == -1) break;
        else dld += n;
        out.write(buffer);
    }
    out.close();
    System.out.println(dld + "B transmitted to " + f.getAbsolutePath());
    return true;
}
However, it by no means works as intended. I tried to download https://upload.wikimedia.org/wikipedia/commons/6/6d/Rubber_Duck_Florentijn_Hofman_Hong_Kong_2013d.jpg for example, and the result was horrifying:
For some reason I was able to view the picture in IrfanView but not in any other viewer, so this is a re-saved version.
I tried messing with the buffer size and downloading other images, but the results are more or less the same.
If I look at the file, there are entire parts of the content simply replaced with dots:
I'm really lost on this one, so thanks for any help :)
The problem occurs when there aren't BUFFER_SIZE (8) bytes of data left to read. This leaves part of the array filled with zeros, which is why you're seeing so many in your hex editor. The solution is simple: replace out.write(buffer); with out.write(buffer, 0, n);. This tells the FileOutputStream to write only the n bytes that were actually read, at indexes 0 through n-1.
Fixed code:
private static final int BUFFER_SIZE = 8;

public static boolean download(URL url, File f) throws IOException {
    URLConnection conn = url.openConnection();
    // Note: the original setDoOutput(true) is dropped here; it is only for sending
    // a request body, and on HttpURLConnection it silently turns a GET into a POST.
    FileOutputStream out = new FileOutputStream(f);
    BufferedInputStream in = new BufferedInputStream(conn.getInputStream());
    // We can move the buffer declaration outside the loop
    byte[] buffer = new byte[BUFFER_SIZE];
    long dld = 0, expected = conn.getContentLengthLong(); // TODO expected will be -1 if the content length is unknown
    while (true) {
        int n = in.read(buffer);
        if (n == -1) break;
        else dld += n;
        out.write(buffer, 0, n);
    }
    in.close();
    out.close();
    System.out.println(dld + "B transmitted to " + f.getAbsolutePath());
    return true;
}
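As a side note, on Java 7 and later you can drop the hand-written loop entirely; a sketch of the same method using java.nio.file.Files, which handles partial reads internally (the byte-count reporting is omitted):
// assumes java.nio.file.Files and java.nio.file.StandardCopyOption
public static void download(URL url, File f) throws IOException {
    try (InputStream in = url.openConnection().getInputStream()) {
        // copies the stream to the file, overwriting any earlier attempt
        Files.copy(in, f.toPath(), StandardCopyOption.REPLACE_EXISTING);
    }
}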
Try something like this to download pictures:
public static byte[] download(String param) throws IOException {
    InputStream in = null;
    ByteArrayOutputStream out = null;
    try {
        URL url = new URL(param);
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setConnectTimeout(120000);
        con.setReadTimeout(120000);
        con.setRequestMethod("GET");
        con.connect();
        in = new BufferedInputStream(con.getInputStream());
        out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n = 0;
        while (-1 != (n = in.read(buf))) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    } finally {
        try {
            out.close();
        } catch (Exception e1) {
        }
        try {
            in.close();
        } catch (Exception e2) {
        }
    }
}
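On Java 7+, try-with-resources removes the need for the nested finally gymnastics; the same method, sketched with automatic stream closing:
public static byte[] download(String param) throws IOException {
    URL url = new URL(param);
    HttpURLConnection con = (HttpURLConnection) url.openConnection();
    con.setConnectTimeout(120000);
    con.setReadTimeout(120000);
    con.setRequestMethod("GET");
    // both streams are closed automatically, even if an exception is thrown
    try (InputStream in = new BufferedInputStream(con.getInputStream());
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] buf = new byte[1024];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}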

How to write files to an SD Card from the internet?

An example would be a simple image.
I have tried so many things and it just refuses to work despite making a whole lot of sense.
What I've done so far is I'm able to grab 25 pictures and add them to
/sdcard/app name/sub/dir/filename.jpg
They all appear there according to DDMS, but they always have a file size of 0.
I'm guessing it's probably because of my input stream?
Here's my function that handles the downloading and saving.
public void DownloadPages() {
    for (int fileC = 0; fileC < pageAmount; fileC++) {
        URL url;
        String path = "/sdcard/Appname/sub/dir/";
        File file = new File(path, fileC + ".jpg");
        int size = 0;
        byte[] buffer = null;
        try {
            url = new URL("http://images.bluegartr.com/bucket/gallery/56ca6f9f2ef43ab7349c0e6511edb6d6.png");
            InputStream in = url.openStream();
            size = in.available();
            buffer = new byte[size];
            in.read(buffer);
            in.close();
        } catch (Exception e) {
        }
        if (!new File(path).exists())
            new File(path).mkdirs();
        FileOutputStream out;
        try {
            out = new FileOutputStream(file);
            out.write(buffer);
            out.flush();
            out.close();
        } catch (Exception e) {
        }
    }
}
It just keeps giving me 25 files in that directory, but all of their file sizes are zero. I have no idea why. This is practically the same code I've used in a Java program.
PS...
If you're gonna give me a solution, note that I've already tried code like this. It doesn't work.
try {
    url = new URL(urlString);
    in = new BufferedInputStream(url.openStream());
    fout = new FileOutputStream(filename);
    byte[] data = new byte[1024];
    int count;
    System.out.println("Now downloading File: " + filename.substring(0, filename.lastIndexOf(".")));
    while ((count = in.read(data, 0, 1024)) != -1) {
        fout.write(data, 0, count);
    }
} finally {
    System.out.println("Download complete.");
    if (in != null)
        in.close();
    if (fout != null)
        fout.close();
}
Here's an image of what my directories look like
http://oi48.tinypic.com/2cpcprm.jpg
A slight change to your second option; try it the following way:
byte[] data = new byte[1024];
long total = 0;
int count;
while ((count = input.read(data)) != -1) {
    total += count;
    output.write(data, 0, count);
}
The difference is in the while statement: the original used while ((count = in.read(data, 0, 1024)) != -1).
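For what it's worth, the zero-byte files from the first snippet are most likely caused by in.available(): it only reports bytes already buffered, and on a freshly opened network stream that is often 0, so the code allocates an empty buffer and writes an empty file. A sketch of the problematic pattern:
InputStream in = url.openStream();
int size = in.available();      // frequently 0 right after opening a URL stream
byte[] buffer = new byte[size]; // zero-length buffer
in.read(buffer);                // reads nothing -> zero-byte file on disk
The counted read loop above avoids the problem because it never needs to know the size in advance.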
Using Guava something like this should work:
String fileUrl = "xxx";
File file = new File("downloaded.tmp"); // placeholder; the original left this null, which would throw a NullPointerException
InputStream in = null;
FileOutputStream out = null;
try {
    URL url = new URL(fileUrl); // the original "Uri url = new URI(fileUrl)" doesn't compile
    in = url.openStream();
    out = new FileOutputStream(file);
    ByteStreams.copy(in, out);
}
catch (IOException e) {
    System.out.println(e.toString());
}
finally {
    if (in != null) in.close();
    if (out != null) {
        out.flush();
        out.close();
    }
}
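If you are already pulling in Guava, there is also a one-liner that opens and closes both streams for you; a sketch (the target file name is a placeholder):
// assumes com.google.common.io.Resources and com.google.common.io.Files
Resources.asByteSource(new URL(fileUrl))
        .copyTo(Files.asByteSink(new File("downloaded.tmp"))); // placeholder target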

Getting SQLiteDatabaseCorruptException

The issue is that I'm getting a SQLiteDatabaseCorruptException while executing the following code:
ArrayList<Advertiser> arr = new ArrayList<Advertiser>();
Cursor holo = db.rawQuery("select * from Advertiser;", null);
while (holo.moveToNext()) {
    Advertiser adver = new Advertiser();
    adver.setId(holo.getString(0));
    adver.setNombre(holo.getString(1));
    adver.setDescripcion(holo.getString(2));
    adver.setDireccion(holo.getString(3));
    adver.setContacto(holo.getString(4));
    adver.setSitioWeb(holo.getString(5));
    adver.setFacebook(holo.getString(6));
    adver.setTwitter(holo.getString(7));
    adver.setPosx(holo.getDouble(8));
    adver.setPosy(holo.getDouble(9));
    adver.setCiudad(holo.getString(10));
    System.out.println("Objeto: " + adver.toString());
    arr.add(adver);
}
What happens is that the while loop starts executing normally, but it gets to a point where LogCat shows that the database is corrupt, and then the database gets deleted.
Any reasons why this is happening?
EDITED:
I forgot to add: my application downloads the database when the main activity starts. Does that have something to do with the database getting corrupted? The database is downloaded and stored on the SD card.
EDITED #2:
Here is the code for how I download the database; I hope this helps:
public void DescargaBD() {
    try {
        URL url = new URL("http://71.6.150.179:8079/dbHandler.axd?SqliteDbVersion=0");
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        urlConnection.setRequestMethod("GET");
        urlConnection.setDoOutput(true);
        urlConnection.connect();
        File SDCardRoot = Environment.getExternalStorageDirectory();
        File file = new File(SDCardRoot, "DirLaguna.db");
        FileOutputStream fileOutput = new FileOutputStream(file);
        InputStream inputStream = urlConnection.getInputStream();
        int totalSize = urlConnection.getContentLength();
        int downloadedSize = 0;
        byte[] buffer = new byte[1024];
        int bufferLength = 0;
        while ((bufferLength = inputStream.read(buffer)) > 0) {
            fileOutput.write(buffer, 0, bufferLength);
            downloadedSize += bufferLength;
        }
        fileOutput.close();
        location = file.getAbsolutePath();
    } catch (MalformedURLException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Thanks in advance.
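One way to narrow this down is to verify the download itself before SQLite ever opens the file; a sketch that compares the bytes written against the Content-Length header, reusing the variables from DescargaBD (a totalSize of -1 means the server didn't report a length):
// after the download loop in DescargaBD:
if (totalSize != -1 && downloadedSize != totalSize) {
    // truncated or incomplete download; don't let SQLite open it
    Log.e("DescargaBD", "expected " + totalSize + " bytes, got " + downloadedSize);
    file.delete();
}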

Downloading Large JSON File to local file using Java

I'm attempting to download a JSON from the following URL - http://api.crunchbase.com/v/1/companies.js - to a local file. I'm using Java 1.7 and the following JSON Libraries - http://www.json.org/java/ - to attempt to make it work.
Here's my code:
public static void download(String address, String localFileName) {
    OutputStream out = null;
    URLConnection conn = null;
    InputStream in = null;
    try {
        URL url = new URL(address);
        out = new BufferedOutputStream(new FileOutputStream(localFileName));
        conn = url.openConnection();
        in = conn.getInputStream();
        byte[] buffer = new byte[1024];
        int numRead;
        long numWritten = 0;
        while ((numRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, numRead);
            numWritten += numRead;
            System.out.println(buffer.length);
            System.out.println(" " + buffer.hashCode());
        }
        System.out.println(localFileName + "\t" + numWritten);
    } catch (Exception exception) {
        exception.printStackTrace();
    } finally {
        try {
            if (in != null) {
                in.close();
            }
            if (out != null) {
                out.close();
            }
        } catch (IOException ioe) {
        }
    }
}
When I run the code, everything seems to work until, midway through the loop, the program seems to stop and not continue reading the JSON object.
Does anyone know why this would stop reading? How could I fix the issue?
Try this:
public void saveUrl(String filename, String urlString) throws MalformedURLException, IOException {
    BufferedInputStream in = null;
    FileOutputStream fout = null;
    try {
        in = new BufferedInputStream(new URL(urlString).openStream());
        fout = new FileOutputStream(filename);
        byte[] data = new byte[1024];
        int count;
        while ((count = in.read(data, 0, 1024)) != -1) {
            fout.write(data, 0, count);
        }
    } finally {
        if (in != null)
            in.close();
        if (fout != null)
            fout.close();
    }
}
Does anyone know why this would stop reading? How could I fix the issue?
I can't see anything obviously wrong with the client-side code. In the absence of any other evidence on the client side, I'd look at the server-side logs to see if there are any clues there.
IMO, the most likely explanation is one of the following:
1. There's a bug in the server-side code that is generating the JSON, and it is crashing halfway through.
2. The server (or a proxy / reverse proxy) has a timeout on the time allowed for some part of the interaction, and this particular request is taking too long.
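If a timeout is the cause, you can at least make the client fail fast and loudly instead of silently truncating; a sketch of adding timeouts to the question's download method (the values are arbitrary):
conn = url.openConnection();
conn.setConnectTimeout(15000); // 15 s to establish the connection
conn.setReadTimeout(60000);    // a read blocking longer than 60 s throws SocketTimeoutException
in = conn.getInputStream();
With a read timeout set, a stalled transfer surfaces as a SocketTimeoutException in your catch block rather than a mysteriously short file.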

Android download binary file problems

I am having problems downloading a binary file (video) in my app from the internet. If I download it directly in QuickTime it works fine, but through my app it somehow gets messed up (even though they look exactly the same in a text editor). Here is an example:
URL u = new URL("http://www.path.to/a.mp4?video");
HttpURLConnection c = (HttpURLConnection) u.openConnection();
c.setRequestMethod("GET");
c.setDoOutput(true);
c.connect();
FileOutputStream f = new FileOutputStream(new File(root, "Video.mp4"));
InputStream in = c.getInputStream();
byte[] buffer = new byte[1024];
int len1 = 0;
while ((len1 = in.read(buffer)) > 0) {
    f.write(buffer);
}
f.close();
I don't know if it's the only problem, but you've got a classic Java glitch in there: you're not counting on the fact that read() is always allowed to return fewer bytes than you ask for. Thus, your read could get less than 1024 bytes, but your write always writes out exactly 1024 bytes, possibly including stale bytes from a previous loop iteration.
Correct with:
while ((len1 = in.read(buffer)) > 0) {
    f.write(buffer, 0, len1);
}
Perhaps the higher latency networking or smaller packet sizes of 3G on Android are exacerbating the effect?
new DefaultHttpClient().execute(new HttpGet("http://www.path.to/a.mp4?video"))
        .getEntity()
        .writeTo(new FileOutputStream(new File(root, "Video.mp4")));
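Note that writeTo does not close the output stream it is given; a slightly safer variant of the same one-liner (still the old DefaultHttpClient API shown above):
FileOutputStream out = new FileOutputStream(new File(root, "Video.mp4"));
try {
    new DefaultHttpClient().execute(new HttpGet("http://www.path.to/a.mp4?video"))
            .getEntity().writeTo(out);
} finally {
    out.close(); // writeTo leaves the stream open
}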
One problem is your reading of the buffer. If a read of the input stream returns fewer than 1024 bytes, you will copy bad data. Use:
byte[] buffer = new byte[1024];
int len1 = 0;
while ((len1 = in.read(buffer)) != -1) {
    f.write(buffer, 0, len1);
}
public class download extends Activity {
    private static String fileName = "file.3gp";
    private static final String MY_URL = "Your download url goes here";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        try {
            URL url = new URL(MY_URL);
            HttpURLConnection c = (HttpURLConnection) url.openConnection();
            c.setRequestMethod("GET");
            c.setDoOutput(true);
            c.connect();
            String PATH = Environment.getExternalStorageDirectory() + "/download/";
            Log.d("Abhan", "PATH: " + PATH);
            File file = new File(PATH);
            if (!file.exists()) {
                file.mkdirs();
            }
            File outputFile = new File(file, fileName);
            FileOutputStream fos = new FileOutputStream(outputFile);
            InputStream is = c.getInputStream();
            byte[] buffer = new byte[1024];
            int len1 = 0;
            while ((len1 = is.read(buffer)) != -1) {
                fos.write(buffer, 0, len1);
            }
            fos.flush();
            fos.close();
            is.close();
        } catch (IOException e) {
            Log.e("Abhan", "Error: " + e);
        }
        Log.i("Abhan", "Check Your File.");
    }
}
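One caveat with this snippet: the download runs in onCreate on the UI thread, which throws NetworkOnMainThreadException on Android 3.0 and later. A sketch of moving it off the main thread, in the same style as the first snippet in this thread:
new Thread() {
    @Override
    public void run() {
        // run the HttpURLConnection download from onCreate here,
        // off the UI thread
    }
}.start();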
I fixed the code based on the previous feedback in this thread. I tested it using Eclipse and multiple large files, and it is working fine. Just copy and paste this into your environment and change the HTTP path and the location you would like the file downloaded to.
try {
    // this is the file you want to download from the remote server
    String path = "http://localhost:8080/somefile.zip";
    // this is the name of the local file you will create
    String targetFileName = "somefile.zip"; // placeholder; the original left the value blank
    URL u = new URL(path);
    HttpURLConnection c = (HttpURLConnection) u.openConnection();
    c.setRequestMethod("GET");
    // note: the original setDoOutput(true) is dropped; it is unneeded for a GET
    // and can silently turn the request into a POST
    c.connect();
    FileOutputStream f = new FileOutputStream(new File("c:\\junk\\" + targetFileName));
    InputStream in = c.getInputStream();
    byte[] buffer = new byte[1024];
    int len1 = 0;
    while ((len1 = in.read(buffer)) > 0) {
        f.write(buffer, 0, len1);
    }
    f.close();
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (ProtocolException e) {
    e.printStackTrace();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Good luck
Alireza Aghamohammadi
Just use Apache's copy method (Apache Commons IO) - the advantage of using Java!
IOUtils.copy(is, os);
Do not forget to close the streams in a finally block:
try {
    ...
} finally {
    IOUtils.closeQuietly(is);
    IOUtils.closeQuietly(os);
}
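Put together, a Commons IO download might look like this sketch (the URL and file are reused from the question above):
// assumes org.apache.commons.io.IOUtils
InputStream is = null;
OutputStream os = null;
try {
    is = new URL("http://www.path.to/a.mp4?video").openStream();
    os = new FileOutputStream(new File(root, "Video.mp4"));
    IOUtils.copy(is, os);
} finally {
    IOUtils.closeQuietly(is);
    IOUtils.closeQuietly(os);
}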
