Open a downloaded html file with WebView - java

I have an app that opens certain web pages in a WebView. If there is an internet connection, the WebView opens a certain URL and downloads the HTML file. If there is no internet connection, the WebView is supposed to open the previously downloaded HTML file.
This is how I'm trying to do it:
webView.loadUrl(Environment.getExternalStorageDirectory().toString() + "/Android/data/com.whizzapps.stpsurniki/" + razred + ".html");
The path is 100% right, but it still won't show the page for some reason. I did some research and saw that people usually put the downloaded HTML file in the assets folder, but I'm downloading the HTML file at runtime, inside the application, so I don't have access to the assets folder. What should I do?

You can use loadData instead, but you need to read the file into a String first:
data = readFile(Environment.getExternalStorageDirectory().toString() + "/Android/data/com.whizzapps.stpsurniki/" + razred + ".html");
webView.loadData(data, "text/html; charset=UTF-8", null);
//or
//webView.loadDataWithBaseURL(null, data, "text/html; charset=UTF-8", null, null);
Here is a function to read the file (requires java.io.BufferedReader, java.io.FileReader and java.io.IOException):
private String readFile(String path) throws IOException {
    StringBuilder sb = new StringBuilder();
    // try-with-resources closes the reader even if reading fails
    try (BufferedReader br = new BufferedReader(new FileReader(path))) {
        String line;
        while ((line = br.readLine()) != null) {
            sb.append(line).append('\n'); // keep the original line breaks
        }
    }
    return sb.toString();
}
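For example, the call site might look like this. This is a minimal sketch, not part of the original answer: it reuses the path and the razred variable from the question, and uses loadDataWithBaseURL with an explicit encoding, which tends to be more reliable than loadData:
try {
    String data = readFile(Environment.getExternalStorageDirectory().toString()
            + "/Android/data/com.whizzapps.stpsurniki/" + razred + ".html");
    webView.loadDataWithBaseURL(null, data, "text/html", "UTF-8", null);
} catch (IOException e) {
    e.printStackTrace(); // e.g. the file has not been downloaded yet
}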

Related

Android Studio's phone emulator cannot read your local file system: URL.openStream throws FileNotFoundException

For an Android app, I was trying to read a file into a string via URL.openStream. In the future this will be a remote URL, but for now I thought I'd test with a local file. I was pretty sure the URL was correct, because when I paste it into Chrome it displays the file nicely. (A screenshot, omitted here, showed the values of the variables and the Chrome display of the file.)
static private String readFromURL( String urlstr )
{
    StringBuilder sb = new StringBuilder();
    try
    {
        URL url = new URL( urlstr );
        InputStream strm = url.openStream();
        InputStreamReader rdr = new InputStreamReader( strm );
        BufferedReader reader = new BufferedReader( rdr );
        String line;
        while ( (line = reader.readLine()) != null )
            sb.append( line );
    }
    catch ( Exception ex )
    {
        return "Error during read from server: " + ex.toString();
    }
    return sb.toString();
}
Alas, I had completely forgotten something...
I just realized that the phone emulator has no access to my local file system. Which means that I cannot do a 'simple' test like that; the data always has to be fetched from a real URL on the Internet.
Or I'd have to copy some data onto the phone emulator when it starts, but that would make things unnecessarily complicated.
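One common workaround, not mentioned in the original post: the Android emulator maps the special address 10.0.2.2 to the host machine's loopback interface, so you can serve the test file over HTTP on the host and fetch it through that address. A minimal sketch, assuming you serve the file's directory with any local web server:
// On the host, serve the directory first, e.g.:
//   python3 -m http.server 8000
// Inside the emulator, 10.0.2.2 reaches the host's localhost:
String html = readFromURL("http://10.0.2.2:8000/test.html");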

Getting incomplete HTML source on url.openConnection()

I am trying to get the HTML page source for a website, but I am not able to get some image links, which I think are populated dynamically on the web page.
I am using Java like this:
url = new URL(firstLevelURL);
connection = (HttpURLConnection) url.openConnection();
try ( // Read all the text returned by the server
    BufferedReader br = new BufferedReader(new InputStreamReader(connection.getInputStream(), "UTF-8"))) {
    // Read each line of "in" until done; readLine() strips newline characters
    while ((str = br.readLine()) != null) {
        // I am not able to get this image, as it is loaded dynamically using JavaScript/AJAX
        if (str.contains("<img id=\"tileImage")) {
            response = str;
            break;
        }
    }
}
I tried using:
connection.setReadTimeout(15*1000);
But the page is still not loading completely.
Is there any way to wait for the page to load completely before fetching the HTML source?
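No timeout will help here: HttpURLConnection only returns the raw HTML the server sends, and never executes the JavaScript that inserts those images. One option, as a sketch rather than a drop-in fix, is a headless browser such as HtmlUnit, which runs the page's scripts and lets you read the resulting DOM:
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

try (WebClient webClient = new WebClient()) {
    webClient.getOptions().setThrowExceptionOnScriptError(false);
    HtmlPage page = webClient.getPage(firstLevelURL);
    // give background JavaScript up to 10 seconds to finish
    webClient.waitForBackgroundJavaScript(10_000);
    String renderedHtml = page.asXml(); // the DOM after the scripts ran
}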

how to download multiple webpages that use a 'next' button

I am trying to download the latest HTML code from this website. Until recently the URL displayed all the information I needed, but the web designer changed the format so that only a portion of the data is displayed and the user must hit the 'next' button to display the next portion.
The URL doesn't change, though.
Does anyone know how I can download all the information using Java?
Thanks. This is my current code:
URL url = null;
InputStream is = null;
BufferedReader br;
String line;
try {
    url = new URL("HTTP://...../..../...");
    is = url.openStream();
    br = new BufferedReader(new InputStreamReader(is));
    while ((line = br.readLine()) != null)
        System.out.println(line);
} catch (IOException e) {
    e.printStackTrace(); // don't swallow the exception silently
}
....
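Since the address bar doesn't change, the 'next' button almost certainly fires a background HTTP request (often with a page number or offset parameter). Open the browser's developer tools, watch the Network tab while clicking 'next', and replicate that request. A hypothetical sketch, with a made-up endpoint and parameter name you would replace with what the Network tab shows:
// Hypothetical: assumes 'next' requests something like .../data?page=N
StringBuilder all = new StringBuilder();
for (int page = 1; page <= 5; page++) { // page count assumed for the demo
    URL pageUrl = new URL("http://example.com/data?page=" + page);
    try (BufferedReader r = new BufferedReader(
            new InputStreamReader(pageUrl.openStream(), "UTF-8"))) {
        String ln;
        while ((ln = r.readLine()) != null)
            all.append(ln).append('\n');
    }
}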

Extract HTML from URL

I'm using Boilerpipe to extract text from a URL, using this code:
URL url = new URL("http://www.example.com/some-location/index.html");
String text = ArticleExtractor.INSTANCE.getText(url);
The String text contains just the text of the HTML page, but I need to extract the whole HTML code.
Is there anyone who has used this library and knows how to extract the HTML code?
You can check the demo page for more info on the library.
For something as simple as this, you don't really need an external library:
URL url = new URL("http://www.google.com");
InputStream is = (InputStream) url.getContent();
BufferedReader br = new BufferedReader(new InputStreamReader(is));
String line = null;
StringBuilder sb = new StringBuilder(); // StringBuilder is the idiomatic choice here
while ((line = br.readLine()) != null) {
    sb.append(line);
}
String htmlContent = sb.toString();
Just use the KeepEverythingExtractor instead of the ArticleExtractor.
But this is using the wrong tool for the job. What you want is just to download the HTML content of a URL (right?), not to extract content. So why use a content extractor?
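For completeness, a minimal sketch of the KeepEverythingExtractor suggestion above, using the same call style as the question. Note that it keeps all text blocks but still strips markup, so it won't return raw HTML either:
import de.l3s.boilerpipe.extractors.KeepEverythingExtractor;

URL url = new URL("http://www.example.com/some-location/index.html");
// keeps every text block instead of just the main article text
String allText = KeepEverythingExtractor.INSTANCE.getText(url);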
With Java 7 and a Scanner trick, you can do the following:
public static String toHtmlString(URL url) throws IOException {
    Objects.requireNonNull(url, "The url cannot be null.");
    try (InputStream is = url.openStream(); Scanner sc = new Scanner(is)) {
        sc.useDelimiter("\\A"); // \A matches the start of input, so next() reads everything
        if (sc.hasNext()) {
            return sc.next();
        } else {
            return null; // or empty
        }
    }
}
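A quick usage example (the URL is just a placeholder):
String html = toHtmlString(new URL("http://www.example.com"));
System.out.println(html);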

Java and FTP to edit online text files

In my Java Swing program, I read, edit and save various text files in a local folder using Scanner and BufferedWriter. Is there an easy way I can keep my current code but, using FTP, edit a web file rather than a local file? Thanks, everyone.
You can use the URL and URLConnection classes to obtain InputStreams and OutputStreams for files located on an FTP server.
To read a file:
URL url = new URL("ftp://user:pass@my.ftphost.com/myfile.txt");
InputStream in = url.openStream();
To write a file:
URL url = new URL("ftp://user:pass@my.ftphost.com/myfile.txt");
URLConnection conn = url.openConnection();
conn.setDoOutput(true);
OutputStream out = conn.getOutputStream();
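The snippet above only opens the output stream. As a minimal sketch of the remaining steps (the content string is a placeholder), you would then write your text and close the stream:
out.write("new file contents".getBytes("UTF-8"));
out.close(); // closing the stream completes the upload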
I tried to achieve the same thing, and the answers to these questions helped me a lot:
Adding characters to beginning and end of InputStream in Java (the accepted answer shows how to add a custom string to the InputStream)
Uploading a file to a FTP server from android phone? (the answer from Kumar Vivek Mitra shows how to upload a file)
I added new text to the end of my online file like this:
// FTPClient comes from Apache Commons Net (org.apache.commons.net.ftp.FTPClient)
FTPClient con = null;
try {
    con = new FTPClient();
    con.connect(Hostname);
    if (con.login(FTPUsername, FTPPassword)) {
        con.enterLocalPassiveMode(); // important!
        con.setFileType(FTP.BINARY_FILE_TYPE);
        InputStream onlineDataIS = urlOfOnlineFile.openStream();
        String end = "\nteeeeeeeeeeeeeeeeest";
        List<InputStream> streams = Arrays.asList(
                onlineDataIS,
                new ByteArrayInputStream(end.getBytes()));
        InputStream resultIS = new SequenceInputStream(Collections.enumeration(streams));
        // Stores a file on the server using the given name, taking input from the given InputStream
        boolean result = con.storeFile(PathOfTargetFile, resultIS);
        onlineDataIS.close();
        resultIS.close();
        if (result) Log.v("upload result", "succeeded");
        con.logout();
        con.disconnect();
    }
    return "Writing successful";
} catch (IOException e) {
    // some smart error handling
}
Hope that helps.
