Java image request from multipart form

I'm trying to redesign my code, which originally took JSON in a POST body, to now take in images from POST requests. The problem is that when debugging, the image data keeps coming out as weird Unicode. I've tried researching this problem from scratch, but all the examples I've found deal with static images already on the hard drive.
This is the code inside my JSP file, and it works fine for JSON posts. Can someone tell me if I'm on the right path at all?
try {
    // Reads the request body as text -- fine for JSON, but it corrupts
    // binary image data, which is what produces the odd Unicode output.
    BufferedReader in = new BufferedReader(
            new InputStreamReader(request.getInputStream()));
    String inputLine;
    StringBuffer resp = new StringBuffer();
    while ((inputLine = in.readLine()) != null) {
        resp.append(inputLine);
    }
    System.out.println("Images request line: \"" + resp.toString().substring(0, 200) + "\"");
    in.close();
} catch (Exception e) {
    // swallowing the exception hides any read errors
}
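For what it's worth, the weird Unicode is what binary image bytes look like once a Reader has decoded them as text. A hedged sketch of a byte-oriented alternative, assuming a Servlet 3.0+ container with multipart parsing enabled (e.g. via @MultipartConfig) and a form field named "image" (both the field name and the setup are assumptions):
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import javax.servlet.http.Part;

// Read the uploaded image through an InputStream so no byte is
// mangled by character decoding; "image" is a placeholder field name.
Part imagePart = request.getPart("image");
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
try (InputStream in = imagePart.getInputStream()) {
    byte[] chunk = new byte[8192];
    int n;
    while ((n = in.read(chunk)) != -1) {
        buffer.write(chunk, 0, n);
    }
}
byte[] imageBytes = buffer.toByteArray();
System.out.println("Received " + imageBytes.length + " image bytes");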

Related

Detecting error 429 while loading a URL

I am pretty new to Java, and I couldn't find a proper answer to this anywhere, so:
I'm trying to check whether a URL failed to load because of an error, more specifically error 429 (Too Many Requests).
I am getting the data from the URL using this:
String finalLine = null;
URL oracle = new URL(url);
BufferedReader in = new BufferedReader(
        new InputStreamReader(oracle.openStream()));
String inputLine;
while ((inputLine = in.readLine()) != null) {
    finalLine = inputLine; // keeps only the last line
}
in.close();
return finalLine;
It actually works very well; after that I'm using GSON to parse it.
When it doesn't work, the game I'm working on just crashes, so I would like to write an if statement or something that handles the error and responds accordingly.
Is there a proper way of doing that? Thanks!
If this is what you mean, you can try checking the response code using HttpURLConnection#getResponseCode().
URL oracle = new URL("http://httpstat.us/429"); // URL whose response code is 429
HttpURLConnection openConnection = (HttpURLConnection) oracle.openConnection();
if (openConnection.getResponseCode() == 429) {
    System.out.println("Too many requests");
} else {
    System.out.println("ok");
}
openConnection.disconnect();
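To keep the asker's read-then-parse flow, a hedged sketch along the same lines (the test URL is a placeholder) that only reads the body on a 200 and branches on 429 instead of crashing:
URL oracle = new URL("http://httpstat.us/429"); // placeholder URL
HttpURLConnection conn = (HttpURLConnection) oracle.openConnection();
int status = conn.getResponseCode();
if (status == HttpURLConnection.HTTP_OK) { // 200
    BufferedReader in = new BufferedReader(
            new InputStreamReader(conn.getInputStream()));
    String inputLine;
    String finalLine = null;
    while ((inputLine = in.readLine()) != null) {
        finalLine = inputLine;
    }
    in.close();
    // hand finalLine to GSON as before
} else if (status == 429) {
    // back off and retry later instead of crashing
}
conn.disconnect();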

How to retrieve XML/RDF data from a URL in Java?

I have been trying to retrieve XML data from the URL http://dbpedia.org/data/Berlin.rdf and write it to a file on disk using the following code snippet.
URL urlObj = new URL("http://dbpedia.org/data/Berlin.rdf");
java.net.HttpURLConnection connection = (HttpURLConnection) urlObj.openConnection();
InputStream reader = new BufferedInputStream(connection.getInputStream());
BufferedReader breader = new BufferedReader(new InputStreamReader(reader));
String line;
BufferedWriter writer = new BufferedWriter(new FileWriter("resource.xml"));
while ((line = breader.readLine()) != null) {
    // writes the line to the output file
    writer.write(line);
    System.out.println(line);
}
writer.close();
connection.disconnect();
But I get this error: Exception in thread "main" java.io.IOException: Server returned HTTP response code: 502 for URL: http://dbpedia.org/data/Berlin.rdf
What is wrong, and how do I fix it? Thanks in advance.
An HTTP 502 error is a server error.
If you go to the site (http://dbpedia.org/data/Berlin.rdf), you will see that dbpedia is currently undergoing maintenance. Come back in a couple of hours, try again, and your code should work fine.
Update: It's working fine now.
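As a side note, copying line by line as above drops the newline characters from the saved RDF. A hedged sketch that checks the response code first (so a 502 doesn't throw) and copies the bytes exactly, assuming java.nio.file is available (Java 7+):
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

URL urlObj = new URL("http://dbpedia.org/data/Berlin.rdf");
HttpURLConnection connection = (HttpURLConnection) urlObj.openConnection();
if (connection.getResponseCode() == HttpURLConnection.HTTP_OK) {
    try (InputStream in = connection.getInputStream()) {
        // byte-for-byte copy preserves newlines and encoding
        Files.copy(in, Paths.get("resource.xml"), StandardCopyOption.REPLACE_EXISTING);
    }
} else {
    System.err.println("Server returned " + connection.getResponseCode() + "; try again later");
}
connection.disconnect();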

Fetch the source code of a web page using Java?

I have a URL like this and the following method
public static void saveContent(String webURL) throws Exception
{
    URL website = new URL(webURL);
    URLConnection connection = website.openConnection();
    BufferedReader in = new BufferedReader(
            new InputStreamReader(
                    connection.getInputStream()));
    StringBuilder response = new StringBuilder();
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        response.append(inputLine);
    }
    in.close();
    System.out.println(response.toString());
}
However, when I print the web content, it always fetches the source code of the main page (www.google.com).
How can I solve this problem? Thanks for your help.
I copied your code into NetBeans and it seems to work correctly. I think the problem lies in the content of the method argument "webURL". Try running your app in debug mode and look at what you actually get back there.
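If it really does work with a correct argument, a quick way to see what is being fetched is to log the argument and the URL the connection ends up at; HttpURLConnection follows redirects by default, and getURL() reflects the final location once the response is read. A hedged sketch (debugFetch is a hypothetical helper):
// Hypothetical helper: compare the requested URL with the one the
// connection actually resolved to after redirects.
public static void debugFetch(String webURL) throws Exception {
    System.out.println("Requested: " + webURL);
    HttpURLConnection connection =
            (HttpURLConnection) new URL(webURL).openConnection();
    connection.getInputStream().close(); // issues the request
    System.out.println("Resolved to: " + connection.getURL());
    connection.disconnect();
}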

Retrieve a single image via its URL in Java

I have a problem and I hope you can help me; I would appreciate any help. The problem is the following.
I have a camera with an HTTP service, and I am communicating with it over HTTP. I send an HTTP request and get back an HTTP response containing binary JPEG data, but I do not know how to convert that data into a picture.
So my question is: how can I convert that binary data into a picture with Java?
Here is one example.
HTTP request:
GET (url to picture)
HTTP response:
binary JPEG data
Thanks in advance to all of you for your help.
URL url = new URL("http://10.10.1.154" + GETIMAGESCR());
BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
String inputLine;
// while ((inputLine = in.readLine()) != null){
// inputLine = in.readLine();
File file = new File("D:\\alphas\\proba.bin");
boolean postoi = file.createNewFile();
FileWriter fstream = new FileWriter("D:\\alphas\\proba.bin");
BufferedWriter out = new BufferedWriter(fstream);
while ((inputLine = in.readLine()) != null){
out.write(in.readLine());
// out.close();
// System.out.println("File created successfully.");
System.out.println(inputLine);
}
System.out.println("File created successfully.");
out.close();
in.close();
With this code I am getting the binary JPEG data, and I managed to save it to a file. So the question now is how to convert this data into a picture, or how to create the picture.
By the way, I do not need to save the file I receive; if there is a way to create the picture directly, that would be best.
You just need to write the image's byte data to the response and set the proper content type; the servlet will then serve the image.
try {
    URL url = new URL("http://site.com/image.jpeg");
    java.awt.Image image = java.awt.Toolkit.getDefaultToolkit().createImage(url);
} catch (MalformedURLException e) {
    // invalid URL
} catch (IOException e) {
    // I/O failure
}
Am I missing something or are you just looking for this:
new ImageIcon(new URL("http://some.link.to/your/image.jpg"));
If you need to save the data from the URL, then just read the bytes from the corresponding InputStream and write the read bytes to a FileOutputStream:
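For example, a minimal sketch (the URL and file name are placeholders):
URL url = new URL("http://some.link.to/your/image.jpg");
try (InputStream in = url.openStream();
     FileOutputStream out = new FileOutputStream("your-image.jpg")) {
    byte[] buf = new byte[4096];
    int n;
    while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n); // raw bytes, so the JPEG is not corrupted
    }
}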

Optimized option for getting text from a web page

I used url.openConnection() to get text from a web page, but I got a time delay in execution when I tried it in a loop.
I also tried httpUrl.disconnect(), but the change was not significant.
Can anyone suggest a better option? I used the following code:
for (int i = 0; i < 10; i++) {
    URL google = new URL(array[i]); // array of links
    HttpURLConnection yc = (HttpURLConnection) google.openConnection();
    BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        source = source.concat(inputLine);
    }
    in.close();
    yc.disconnect();
}
A couple of issues I can see.
in.readLine() doesn't retain the newline, so when you use concat, all the newlines are removed.
Using concat in a loop like this builds a longer and longer String; it gets slower with each line you add.
Instead you might find IOUtils useful.
URL google = new URL("http://123newyear.com/2011/calendars/"); // "http://" is required; without a protocol, new URL() throws MalformedURLException
String text = IOUtils.toString(google.openConnection().getInputStream());
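If adding Commons IO is not an option, a hedged rewrite of the original loop with StringBuilder addresses both issues (the URLs in array are assumed to include a scheme such as http://):
StringBuilder source = new StringBuilder();
for (int i = 0; i < 10; i++) {
    URL google = new URL(array[i]); // array of links
    HttpURLConnection yc = (HttpURLConnection) google.openConnection();
    BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        source.append(inputLine).append('\n'); // keep the newline; append is cheap
    }
    in.close();
    yc.disconnect();
}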
See Reading Directly from a URL for details on how to get a stream from which you can read the contents of the URL.
Basically, you:
Create a URL: URL url = new URL("http://123newyear.com/2011/calendars/");
Call openStream() on the URL object:
BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
Read from the stream (like you did).
