How to prevent downloading a file from a URL if the file is already downloaded - Java

I need to download files with the ability to resume a partial download after a disconnection. I get an error when trying to download a file that is already completely downloaded: the server returns a 416 status code when I try to get the input stream. I fixed it by adding the condition downloadedFileSize < urlConnection.getContentLength(). Here is my code:
try {
    File file = Path.of(downloadOutputFolder, fileName).toFile();
    boolean append = false;
    connection = (HttpURLConnection) url.openConnection();
    if (file.exists()) {
        connection.setRequestProperty("Range", String.format("bytes=%s-", file.length()));
        append = true;
    }
    if (!file.exists() || file.exists() && file.length() < getFileSizeFromUrl(url)) {
        try (InputStream downloadStream = connection.getInputStream();
             FileOutputStream fileOutputStream = new FileOutputStream(file, append)) {
            int byteRead;
            byte[] buffer = new byte[BUFFER_SIZE];
            while ((byteRead = downloadStream.read(buffer)) != -1) {
                fileOutputStream.write(buffer, 0, byteRead);
            }
        }
    }
}...
private int getFileSizeFromUrl(URL url) {
    HttpURLConnection connection = null;
    try {
        connection = (HttpURLConnection) url.openConnection();
        return connection.getContentLength();
    } catch (IOException e) {
        log.error("Error during open URL connection to {}", url, e);
    } finally {
        if (connection != null) {
            connection.disconnect();
        }
    }
    return -1;
}
Is it possible to handle this case with the URLConnection API itself, rather than by adding the condition file.length() < getFileSizeFromUrl(url)?
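One option (a sketch, not a tested answer) is to skip the separate size lookup and branch on the response code of the ranged request itself: 416 means the existing file already covers the requested range, 206 means the server will send only the missing tail, and 200 means it ignored the Range header and is resending the whole file.

// Sketch only: branch on the status code of the ranged request instead of
// pre-fetching the size with a second request. BUFFER_SIZE is assumed to be
// defined as in the question; downloadIfIncomplete is a hypothetical name.
private void downloadIfIncomplete(URL url, File file) throws IOException {
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    boolean resuming = file.exists();
    if (resuming) {
        connection.setRequestProperty("Range", "bytes=" + file.length() + "-");
    }
    try {
        int code = connection.getResponseCode();
        if (code == 416) { // Range Not Satisfiable: the local copy is already complete
            return;
        }
        // 206: the server honoured the range, so append the missing tail.
        // 200: the server ignored the range, so rewrite the file from the start.
        boolean append = resuming && code == HttpURLConnection.HTTP_PARTIAL;
        try (InputStream in = connection.getInputStream();
             FileOutputStream out = new FileOutputStream(file, append)) {
            byte[] buffer = new byte[BUFFER_SIZE];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    } finally {
        connection.disconnect();
    }
}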

Related

Android - File Corrupts while saving

I am saving a file to disk after downloading it from the server, but I believe it gets corrupted while being saved to disk. If the same file is downloaded using Chrome on a Mac, or by any other method, it downloads and reads normally, so the corruption seems to happen during the saving step. I am adding the code to help find the problem. The file is a CSS file.
Corruption:
Some whitespace-like characters appear when reading the file. A surprising thing I noticed is that if I reduce BUFFER_SIZE from 4096 to 32, the file does not get corrupted; I couldn't figure out why. In general, reducing BUFFER_SIZE reduces the number of corrupted characters.
I'd appreciate any pointers in the right direction. Thanks.
private static final int BUFFER_SIZE = 4096;

// saves file to disk and returns the contents of the file.
public static String downloadFile(Context context, String filePath, String destParent) {
    String content = null;
    StringBuilder sb = new StringBuilder();
    HttpURLConnection connection = null;
    InputStream is = null;
    FileOutputStream os = null;
    String sUrl = Urls.makeWebAssetUrl(filePath); // consider this my file URL
    String destFile = getContextBaseDir(context) + (destParent != null ? File.separator + destParent : "") + File.separator + filePath;
    try {
        URL url = new URL(sUrl);
        connection = (HttpURLConnection) url.openConnection();
        connection.connect();
        int responseCode = connection.getResponseCode();
        if (responseCode == HttpURLConnection.HTTP_OK) {
            File outFile = new File(destFile);
            if (!outFile.getParentFile().exists()) {
                if (!outFile.getParentFile().mkdirs()) {
                    throw new RuntimeException("Unable to create parent directories for " + filePath);
                }
            }
            is = connection.getInputStream();
            os = new FileOutputStream(outFile);
            int bytesRead = 0;
            byte[] buffer = new byte[BUFFER_SIZE];
            while ((bytesRead = is.read(buffer)) != -1) {
                sb.append(new String(buffer, 0, bytesRead, DEFAULT_ENCODING));
                os.write(buffer);
            }
            content = sb.toString();
        } else {
            LogUtils.LOGW(TAG, responseCode + " while connecting to " + sUrl + ": " + connection.getResponseMessage());
        }
    } catch (Exception e) {
        LogUtils.LOGE(TAG, "Error while downloading " + sUrl, e);
    } finally {
        if (is != null) {
            try {
                is.close();
            } catch (IOException e) {
                LogUtils.LOGE(TAG, "Error closing inputStream while downloading " + sUrl, e);
            }
        }
        if (os != null) {
            try {
                os.flush();
            } catch (IOException e) {
                LogUtils.LOGE(TAG, "Error flushing outputStream while downloading " + sUrl, e);
            }
            try {
                os.close();
            } catch (IOException e) {
                LogUtils.LOGE(TAG, "Error closing outputStream while downloading " + sUrl, e);
            }
        }
    }
    return content;
}
os.write(buffer);
The problem is here. It should be:
os.write(buffer, 0, bytesRead);
I don't know why you are also accumulating the content in a StringBuilder and returning it as a String. That won't scale, and in any case it's redundant. Remove it.
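For reference, this is the write loop with that one-line fix applied (everything else in the method stays as posted):

byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
while ((bytesRead = is.read(buffer)) != -1) {
    // write only the bytes read in this iteration, not the whole buffer,
    // which still holds stale data from previous reads past bytesRead
    os.write(buffer, 0, bytesRead);
}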

How can I pause a download process via AsyncTask

I'm making an app with a download process that downloads a certain file. If I turn off the Wi-Fi while the file is downloading, the app crashes.
This is the log: recvfrom failed: ETIMEDOUT (Connection timed out)
I have a conditional, but it seems not to work. If I debug the code, it seems to enter the conditional.
else {
    Thread.sleep(4000); // doesn't work, doesn't sleep
    downloadresult = false;
}
I want the download process to pause when I turn off the Wi-Fi. Is there any way of doing this?
Thanks in advance.
All code
protected String doInBackground(String... f_url) {
    try {
        long total = 0;
        URL url = new URL(f_url[0]);
        HttpURLConnection conection = (HttpURLConnection) url.openConnection();
        int lenghtOfFile = conection.getContentLength();
        BufferedOutputStream output = new BufferedOutputStream(new FileOutputStream(file));
        conection.connect();
        BufferedInputStream input = new BufferedInputStream(conection.getInputStream());
        byte data[] = new byte[8192];
        int lastcount = 0;
        while ((count = input.read(data)) != -1) {
            if (isCanceled) { // this code waits for the cancel button click :)
                file.delete();
                downloadresult = false;
                break;
            }
            if (intCheck()) { // check internet and download
                total += count;
                downloadresult = true;
                int ProgBarCount = (int) ((total * 100) / lenghtOfFile);
                if (ProgBarCount > lastcount) {
                    lastcount = ProgBarCount;
                    publishProgress(Integer.toString(ProgBarCount));
                }
                output.write(data, 0, count);
            } else {
                Thread.sleep(4000); // doesn't work, doesn't sleep
                downloadresult = false;
            }
        }
        output.flush();
        output.close();
        input.close();
    } catch (Exception e) {
        e.printStackTrace();
        exmessage = e.getMessage().toString();
        downloadresult = false;
    }
    return null;
}
If I debug the code, it works perfectly. If the app can't download the file, I want it to wait 4000 ms and then try again, but when I run the app normally, it crashes.
How can I pause/resume the download process? Thank you.
I fixed the problem :)
Thanks for all the responses, I love it <3
Code:
protected String doInBackground(String... f_url) {
    try {
        long total = 0;
        URL url = new URL(f_url[0]);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        /* if (file.exists()) {
            connection.setAllowUserInteraction(true);
            connection.setRequestProperty("Range", "bytes=" + lenghtOfFile + "-");
        } */
        if (file.exists()) {
            deneme = file.length();
            connection.setRequestProperty("Range", "bytes=" + (file.length()) + "-");
        } else {
            connection.setRequestProperty("Range", "bytes=" + deneme + "-");
        }
        String connectionField = connection.getHeaderField("content-range");
        if (connectionField != null) {
            String[] connectionRanges = connectionField.substring("bytes=".length()).split("-");
            deneme = Long.valueOf(connectionRanges[0]);
        }
        if (connectionField == null && file.exists())
            file.delete();
        connection.setConnectTimeout(14000);
        connection.setReadTimeout(20000);
        connection.connect();
        long lenghtOfFile = connection.getContentLength() + deneme;
        RandomAccessFile output = new RandomAccessFile(file, "rw");
        BufferedInputStream input = new BufferedInputStream(connection.getInputStream());
        output.seek(deneme);
        byte data[] = new byte[1024];
        int lastcount = 0;
        while ((count = input.read(data, 0, 1024)) != -1) {
            if (isCanceled) { // this code waits for the cancel button click :)
                file.delete();
                downloadresult = false;
                break;
            }
            if (intCheck()) { // check internet and download
                total += count;
                downloadresult = true;
                int ProgBarCount = (int) ((total * 100) / lenghtOfFile);
                if (ProgBarCount > lastcount) {
                    lastcount = ProgBarCount;
                    publishProgress(Integer.toString(ProgBarCount));
                }
                output.write(data, 0, count);
            }
        }
        // output.flush();
        output.close();
        input.close();
    } catch (Exception e) {
        e.printStackTrace();
        exmessage = e.getMessage().toString();
        downloadresult = false;
    }
    return null;
}
Don't pause background tasks. When anything goes wrong, cancel the task. Remember, your app was smart enough to start a background task; it will be smart enough to restart it later. If you're going to pause a background task, it should be paused only at the user's behest.
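A minimal sketch of that cancel-and-restart approach (intCheck() is the connectivity check from the question; downloadTask is a hypothetical reference to the running AsyncTask): cancel when connectivity drops, let doInBackground() exit cooperatively via isCancelled(), and later start a new task that resumes with the Range header shown above.

// Wherever connectivity is observed (hypothetical): cancel instead of pausing.
if (!intCheck()) {
    downloadTask.cancel(true); // interrupts the blocked read in doInBackground()
}

// Inside doInBackground(): bail out cooperatively and keep the partial file,
// so the next task can resume it with a Range request.
while ((count = input.read(data)) != -1) {
    if (isCancelled()) {
        break;
    }
    output.write(data, 0, count);
    // ... progress reporting as above ...
}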

Request Method GET returns wrong content length

Hi, I am using an HttpURLConnection to get a text file's content, and I want to know the size of that file. I use the getContentLength() method, but it returns the wrong value: for example, in this code the file's size is 17509, but it returns 5147.
Any help?
Thanks so much in advance :).
new Thread() {
    @Override
    public void run() {
        String path = parser.getValue(e, "txt");
        URL u = null;
        try {
            u = new URL(path);
            HttpURLConnection c = (HttpURLConnection) u.openConnection();
            c.setRequestMethod("GET");
            c.connect();
            int lenghtOfFile = c.getContentLength();
            InputStream in = c.getInputStream();
            final ByteArrayOutputStream bo = new ByteArrayOutputStream();
            byte[] buffer = new byte[1024];
            long total = 0;
            Log.i("p1", "" + lenghtOfFile);
            while ((count = in.read(buffer)) != -1) {
                total += count;
                Log.i("p2", "" + total);
                bo.write(buffer, 0, count);
            }
            bo.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (ProtocolException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}.start();
The content-length is a header set by the server. I would check to make sure that your server is returning the correct content-length. You can do that with curl:
curl -v http://path/to/file.txt
That should show you the headers that were sent and returned.
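If curl shows the server reporting the right length, one other thing worth ruling out on Android (an assumption, not something the question confirms) is transparent gzip: HttpURLConnection requests gzip compression by default, and the length it then reports corresponds to the compressed stream rather than the decompressed bytes you actually read. Asking for the identity encoding disables that:

HttpURLConnection c = (HttpURLConnection) u.openConnection();
c.setRequestProperty("Accept-Encoding", "identity"); // turn off transparent gzip
c.setRequestMethod("GET");
c.connect();
int lenghtOfFile = c.getContentLength(); // should now match the raw file size on the server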
A quick workaround I can think of is just ignoring the content-length and reading the input stream until there's nothing left to read.
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream(8192);
int read = inputStream.read();
while (read != -1) {
    byteArrayOutputStream.write((byte) read);
    read = inputStream.read();
}
byteArrayOutputStream.flush();
buf = byteArrayOutputStream.toByteArray();

Getting SQLiteDatabaseCorruptException

The issue is that I'm getting a SQLiteDatabaseCorruptException while executing the following code:
ArrayList<Advertiser> arr = new ArrayList<Advertiser>();
Cursor holo = db.rawQuery("select * from Advertiser;", null);
while (holo.moveToNext()) {
    Advertiser adver = new Advertiser();
    adver.setId(holo.getString(0));
    adver.setNombre(holo.getString(1));
    adver.setDescripcion(holo.getString(2));
    adver.setDireccion(holo.getString(3));
    adver.setContacto(holo.getString(4));
    adver.setSitioWeb(holo.getString(5));
    adver.setFacebook(holo.getString(6));
    adver.setTwitter(holo.getString(7));
    adver.setPosx(holo.getDouble(8));
    adver.setPosy(holo.getDouble(9));
    adver.setCiudad(holo.getString(10));
    System.out.println("Objeto: " + adver.toString());
    arr.add(adver);
}
What happens is that the while loop starts executing normally, but at some point LogCat shows that the database is corrupt, and then the database gets deleted.
Any reasons why this is happening?
EDITED:
I forgot to add: my application downloads the database when the main activity starts. Does that have something to do with the database getting corrupted? The database is downloaded and stored on the SD card.
EDITED #2:
Here is the code showing how I download the database; I hope this helps:
public void DescargaBD() {
    try {
        URL url = new URL("http://71.6.150.179:8079/dbHandler.axd?SqliteDbVersion=0");
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        urlConnection.setRequestMethod("GET");
        urlConnection.setDoOutput(true);
        urlConnection.connect();
        File SDCardRoot = Environment.getExternalStorageDirectory();
        File file = new File(SDCardRoot, "DirLaguna.db");
        FileOutputStream fileOutput = new FileOutputStream(file);
        InputStream inputStream = urlConnection.getInputStream();
        int totalSize = urlConnection.getContentLength();
        int downloadedSize = 0;
        byte[] buffer = new byte[1024];
        int bufferLength = 0;
        while ((bufferLength = inputStream.read(buffer)) > 0) {
            fileOutput.write(buffer, 0, bufferLength);
            downloadedSize += bufferLength;
        }
        fileOutput.close();
        location = file.getAbsolutePath();
    } catch (MalformedURLException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Thanks in advance.
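Not a confirmed diagnosis, but one thing in DescargaBD() worth removing as a test is setDoOutput(true): on HttpURLConnection it signals that a request body will be written and can silently turn the GET into a POST, and some servers answer such a request with something other than the raw file. A sketch of the same download without it:

URL url = new URL("http://71.6.150.179:8079/dbHandler.axd?SqliteDbVersion=0");
HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
urlConnection.setRequestMethod("GET"); // plain GET, no setDoOutput(true)
urlConnection.connect();

File file = new File(Environment.getExternalStorageDirectory(), "DirLaguna.db");
FileOutputStream fileOutput = new FileOutputStream(file);
InputStream inputStream = urlConnection.getInputStream();

byte[] buffer = new byte[1024];
int bufferLength;
while ((bufferLength = inputStream.read(buffer)) != -1) {
    fileOutput.write(buffer, 0, bufferLength);
}
fileOutput.close();
inputStream.close();
urlConnection.disconnect();
location = file.getAbsolutePath();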

Downloading Large JSON File to local file using Java

I'm attempting to download a JSON file from the following URL - http://api.crunchbase.com/v/1/companies.js - to a local file. I'm using Java 1.7 and the JSON library from http://www.json.org/java/ to make it work.
Here's my code:
public static void download(String address, String localFileName) {
    OutputStream out = null;
    URLConnection conn = null;
    InputStream in = null;
    try {
        URL url = new URL(address);
        out = new BufferedOutputStream(new FileOutputStream(localFileName));
        conn = url.openConnection();
        in = conn.getInputStream();
        byte[] buffer = new byte[1024];
        int numRead;
        long numWritten = 0;
        while ((numRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, numRead);
            numWritten += numRead;
            System.out.println(buffer.length);
            System.out.println(" " + buffer.hashCode());
        }
        System.out.println(localFileName + "\t" + numWritten);
    } catch (Exception exception) {
        exception.printStackTrace();
    } finally {
        try {
            if (in != null) {
                in.close();
            }
            if (out != null) {
                out.close();
            }
        } catch (IOException ioe) {
        }
    }
}
When I run the code, everything seems to work until, midway through the loop, the program seems to stop and not continue reading the JSON object.
Does anyone know why this would stop reading? How could I fix the issue?
Try This:
public void saveUrl(String filename, String urlString) throws MalformedURLException, IOException {
    BufferedInputStream in = null;
    FileOutputStream fout = null;
    try {
        in = new BufferedInputStream(new URL(urlString).openStream());
        fout = new FileOutputStream(filename);
        byte data[] = new byte[1024];
        int count;
        while ((count = in.read(data, 0, 1024)) != -1) {
            fout.write(data, 0, count);
        }
    } finally {
        if (in != null)
            in.close();
        if (fout != null)
            fout.close();
    }
}
Does anyone know why this would stop reading? How could I fix the issue?
I can't see anything obviously wrong with the client-side code. In the absence of any other evidence on the client side, I'd look at the server-side logs to see if there are any clues there.
IMO, the most likely explanation is one of the following:
There's a bug in the server-side code that is generating the JSON and it is crashing halfway through.
The server (or a proxy / reverse proxy) has a timeout on the time allowed for some part of the interaction, and this particular request is taking too long.
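If the second explanation is the culprit, a client-side mitigation worth trying (a sketch under that assumption, not a guaranteed fix) is to set explicit timeouts so a stalled transfer fails with an exception instead of appearing to hang, and to compare the bytes written against the reported Content-Length so a truncated file is detected:

URLConnection conn = url.openConnection();
conn.setConnectTimeout(15000); // fail fast if the connection cannot be established
conn.setReadTimeout(30000);    // fail if the server stops sending mid-stream
long expected = conn.getContentLengthLong(); // -1 if the server does not report it

// ... same copy loop as above, accumulating numWritten ...

if (expected != -1 && numWritten != expected) {
    throw new IOException("Truncated download: " + numWritten + " of " + expected + " bytes");
}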
