There is a file that is downloaded when I make a GET request to a particular URL. I can obtain an InputStream in both of the following ways.
Method 1
Using the URL class in the java.net package.
java.net.URL url = new URL(downloadFileUrl);
InputStream inputStream = url.openStream();
Method 2
Using Apache's HttpClient class.
org.apache.http.impl.client.CloseableHttpClient httpclient = org.apache.http.impl.client.HttpClients.createDefault();
HttpGet request = new HttpGet(url);
CloseableHttpResponse response = httpclient.execute(request);
InputStream inputStream = response.getEntity().getContent();
Are these methods equivalent? If not, how do they differ? Which method is preferred in general, or in specific situations?
The examples I provided are simplistic. Assume I did the necessary configurations with the URL and HttpClient objects to get a successful response.
Both methods return an input stream to read from the connection; in that respect there is no difference between them. Since HttpClient is a third-party library, you need to watch for vulnerabilities and keep the library updated.
The only difference is that HttpClient supports only HTTP(S), whereas URLConnection can also be used for other protocols, such as FTP.
In terms of functionality, Apache HttpClient offers far more fine-tuning options than URLConnection.
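For example, a minimal sketch of the kind of tuning HttpClient 4.x exposes (the timeout values are only illustrative):
import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

// Timeouts, redirect handling, etc. can be configured per client or per request.
RequestConfig config = RequestConfig.custom()
        .setConnectTimeout(5000)    // max time to establish the TCP connection (ms)
        .setSocketTimeout(10000)    // max time to wait between data packets (ms)
        .setRedirectsEnabled(true)  // follow 3xx responses automatically
        .build();
CloseableHttpClient httpclient = HttpClients.custom()
        .setDefaultRequestConfig(config)
        .build();
By comparison, URLConnection exposes only connect/read timeouts and a handful of request properties.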
Related
Method 1
Using the URL class in the java.net package.
String sourceUrl = "https://thumbor.thedailymeal.com/P09kUdGYdBReFSJne1qjVDIphDM=//https://videodam-assets.thedailymeal.com/filestore/5/3/0/2_37ec80e4c368169/5302scr_43fcce37a98877f.jpg%3Fv=2020-03-16+21%3A06%3A42&version=0";
java.net.URL url = new URL(sourceUrl);
InputStream inputStream = url.openStream();
Files.copy(inputStream, Paths.get("/Users/test/rr.png"), StandardCopyOption.REPLACE_EXISTING);
Method 2
Using Apache's HttpClient class.
String sourceUrl = "https://thumbor.thedailymeal.com/P09kUdGYdBReFSJne1qjVDIphDM=//https://videodam-assets.thedailymeal.com/filestore/5/3/0/2_37ec80e4c368169/5302scr_43fcce37a98877f.jpg%3Fv=2020-03-16+21%3A06%3A42&version=0";
CloseableHttpClient httpclient = HttpClients.createDefault();
HttpGet httpget = new HttpGet(sourceUrl);
HttpResponse httpresponse = httpclient.execute(httpget);
InputStream inputStream = httpresponse.getEntity().getContent();
Files.copy(inputStream, Paths.get("/Users/test/rr.png"), StandardCopyOption.REPLACE_EXISTING);
I downloaded the rr.png file using both methods and found that the two files differ, even in size; method 2 downloads a blank image. I have read that both methods are the same, so I do not understand why method 1 downloads the correct file while method 2 downloads a wrong one. Please clarify this, and also let me know whether there is a fix for method 2 so that it downloads the correct file.
First: cross-posting: https://coderanch.com/t/728266/java/URL-openStream-HttpResponse-getEntity-getContent
Second: I suspect the issue is the URL and how it is handled differently by Java's built-in class and the Apache library. Use a debugger and step through both to see what URL actually gets sent out over the TLS stream.
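One quick way to compare, short of a full debugging session, is to print the request target each API derives from the same string; the URL above contains percent-encoded characters, which the two libraries may normalize differently (an illustrative sketch):
java.net.URL url = new java.net.URL(sourceUrl);
System.out.println(url.toExternalForm()); // what java.net.URL will request

HttpGet httpget = new HttpGet(sourceUrl);
System.out.println(httpget.getURI()); // what HttpClient 4.x will request after its own URI parsing
If the two printed values differ, that difference is very likely why the downloaded files differ.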
I am trying to consume a RESTful web service using Java (HttpURLConnection and InputStream). I am able to print the response using a BufferedReader, but it includes the response headers as well, and that format causes issues when converting the response to a Java POJO.
I tried using a URLConnection, retrieving the input stream, and passing it to the ObjectMapper (provided by Jackson):
final URL url = new URL("url");
final HttpURLConnection uc = (HttpURLConnection) url.openConnection();
uc.setRequestMethod("GET");
final ObjectMapper objectMapper = new ObjectMapper();
MyData myData = objectMapper.readValue(uc.getInputStream(), MyData.class);
Error message: "No content to map due to end-of-input\n"
Your code doesn't show where you actually read the data, or where you declared and filled your output variable. As the code stands, it appears the response from your REST service is being read incorrectly. Instead of writing your own code to read from the REST URL, I would suggest using a third-party library that does it for you. Here are a few suggestions: Apache HttpClient, OkHttp, and finally my favorite, MgntUtils Http Client (a library written and maintained by me). Here is the HttpClient javadoc, here is the link to the latest Maven artifacts for the MgntUtils library, and here is the MgntUtils GitHub link, which contains the library itself with sources and javadoc. Choose an HTTP client, read the content using that client, and then you can work with the content.
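That said, the usual cause of "No content to map due to end-of-input" is an empty response body, often because the server returned an error status. A minimal sketch with plain HttpURLConnection that checks the status before mapping (readAllBytes requires Java 9+; "url" is the placeholder from the question):
HttpURLConnection uc = (HttpURLConnection) new URL("url").openConnection();
uc.setRequestMethod("GET");
uc.setRequestProperty("Accept", "application/json");

int status = uc.getResponseCode(); // sends the request and reads the status line
if (status >= 400) {
    // Error bodies arrive on the error stream, not the input stream;
    // printing the body usually reveals why there was no JSON to map.
    try (InputStream err = uc.getErrorStream()) { // may be null if there is no body
        if (err != null) {
            System.out.println(new String(err.readAllBytes(), StandardCharsets.UTF_8));
        }
    }
} else {
    MyData myData = new ObjectMapper().readValue(uc.getInputStream(), MyData.class);
}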
I've successfully managed to log on to a site using HttpClient and print out the cookies that enable that logon.
However, I am now stuck: I want to display subsequent pages in a JEditorPane using its setPage(url) method. When I do that and analyze my GET request using Wireshark, I see that the user agent is not my HttpClient but the following:
User-Agent: Java/1.6.0_17
The GET request (which is coded somewhere inside JEditorPane's setPage(URL url) method) does not carry the cookies that were retrieved using HttpClient. My question is: how can I transfer the cookies received with HttpClient so that my JEditorPane can display URLs from the site?
I'm beginning to think it's not possible and that I should try to log on using a normal Java URLConnection instead, but I would rather stick with HttpClient as it's more flexible (I think). Presumably I would still have a problem with the cookies?
I had thought of extending the JEditorPane class and overriding setPage(), but I don't know what code to put in it, as I can't seem to find out how setPage() actually works.
Any help/suggestions would be greatly appreciated.
Dave
As I mentioned in the comment, HttpClient and the URLConnection used by the JEditorPane to fetch the URL content don't talk to each other, so any cookies that HttpClient may have fetched won't transfer over to the URLConnection. However, you can subclass JEditorPane like so:
final HttpClient httpClient = new DefaultHttpClient();
/* initialize httpClient and fetch your login page to get the cookies */
JEditorPane myPane = new JEditorPane() {
    @Override
    protected InputStream getStream(URL url) throws IOException {
        HttpGet httpget = new HttpGet(url.toExternalForm());
        HttpResponse response = httpClient.execute(httpget);
        HttpEntity entity = response.getEntity();
        // important! by overriding getStream you're responsible for setting the content type!
        setContentType(entity.getContentType().getValue());
        // another thing you're now responsible for... this will be used to resolve
        // images and other relative references. also beware whether it needs to be a URL or a String
        getDocument().putProperty(Document.StreamDescriptionProperty, url);
        // using commons-io here to take care of some of the more annoying aspects of InputStream
        InputStream content = entity.getContent();
        try {
            return new ByteArrayInputStream(IOUtils.toByteArray(content));
        } catch (RuntimeException e) {
            httpget.abort(); // per the HttpClient examples, abort must be called on unexpected exceptions
            throw e;
        } finally {
            IOUtils.closeQuietly(content);
        }
    }
};
// now you can do this!
myPane.setPage(new URL("http://www.google.com/"));
By making this change, you'll be using HttpClient to fetch the URL content for your JEditorPane. Be sure to read the JavaDoc at http://download.oracle.com/javase/1.4.2/docs/api/javax/swing/JEditorPane.html#getStream(java.net.URL) to make sure you catch all the corner cases. I think I've got most of them sorted, but I'm not an expert.
Of course, you can change the HttpClient part of the code to avoid loading the response into memory first, but this is the most concise way. And since you're going to load the content into an editor, it will all be in memory at some point. ;)
Under Java 5 & 6, there is a default cookie manager which "automatically" supports HttpURLConnection, the type of connection JEditorPane uses by default.
Based on this blog entry, writing something like
CookieManager manager = new CookieManager();
manager.setCookiePolicy(CookiePolicy.ACCEPT_ALL); // note: ACCEPT_NONE would reject every cookie
CookieHandler.setDefault(manager);
seems to be enough to support cookies in a JEditorPane.
Make sure to add this code before any internet communication with JEditorPane takes place.
Sorry, I'm quite new to Java.
I've stumbled across HttpGet and HttpPost, which seem perfect for my needs but a little long-winded. I have written a rather bad wrapper class, but does anyone know where to get a better one?
Ideally, I'd be able to do
String response = fetchContent("http://url/", postdata);
where postdata is optional.
Thanks!
HttpClient sounds like what you want. You certainly can't do things like the above in one line out of the box, but it's a fully-fledged HTTP library that wraps up GET/POST requests (and the rest).
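If you do want something close to a one-liner, a thin wrapper along the lines you describe is easy to build on top of HttpClient 4.x. A minimal sketch (fetchContent is your suggested name, not a standard API; passing null for postdata means GET):
public static String fetchContent(String url, String postdata) throws IOException {
    try (CloseableHttpClient client = HttpClients.createDefault()) {
        HttpUriRequest request;
        if (postdata == null) {
            request = new HttpGet(url);
        } else {
            HttpPost post = new HttpPost(url);
            post.setEntity(new StringEntity(postdata)); // raw body; use UrlEncodedFormEntity for form fields
            request = post;
        }
        HttpResponse response = client.execute(request);
        return EntityUtils.toString(response.getEntity()); // buffers the whole body as a String
    }
}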
I would consider using the HttpClient library. From its documentation, you can generate a POST like this:
PostMethod post = new PostMethod("http://jakarta.apache.org/");
NameValuePair[] data = {
    new NameValuePair("user", "joe"),
    new NameValuePair("password", "bloggs")
};
post.setRequestBody(data);
// execute method and handle any error responses.
...
InputStream in = post.getResponseBodyAsStream();
// handle response.
There are a number of advanced options for configuring the client, should you eventually require them.
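For instance, connection tuning in this (HttpClient 3.x) API looks roughly like the following; the timeout values are placeholders:
HttpClient client = new HttpClient();
// Both values are in milliseconds.
client.getHttpConnectionManager().getParams().setConnectionTimeout(5000); // time allowed to establish the connection
client.getHttpConnectionManager().getParams().setSoTimeout(10000);        // socket read timeout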
I need to establish an HTTPS connection (to a website, of course) and send/read over it, but through an HTTP proxy or SOCKS proxy. A few other requirements:
it must support blocking I/O (I can't use non-blocking/NIO)
the proxy must not be set as an environment variable or some other global-scope property (multiple threads are accessing it)
I was looking into HttpCore components, but I did not see any support for blocking HTTPS.
Look at the java.net.Proxy class; it does what you need. You create one and then pass it to the URLConnection when opening the connection.
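A minimal sketch (host and port are placeholders; use Proxy.Type.SOCKS for a SOCKS proxy):
// Per-connection proxy: nothing global is touched, so each thread can use its own.
Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy.example.com", 8080));
URLConnection conn = new URL("https://www.example.com/").openConnection(proxy);
try (InputStream in = conn.getInputStream()) {
    // read the response over the blocking stream ...
}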
To support a per-thread proxy, your best bet is Apache HttpClient 4 (HttpComponents Client). Get the source code:
http://hc.apache.org/downloads.cgi
It comes with examples for both HTTP proxy and SOCKS proxy,
ClientExecuteProxy.java
ClientExecuteSOCKS.java
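With HttpClient 4.3+, the HTTP-proxy case from ClientExecuteProxy boils down to roughly this (host and port are placeholders); the SOCKS case needs a custom connection socket factory, which ClientExecuteSOCKS demonstrates:
HttpHost proxy = new HttpHost("proxy.example.com", 8080);
// The proxy is scoped to this client instance, not to the JVM,
// so different threads can build clients with different proxies.
CloseableHttpClient client = HttpClients.custom()
        .setProxy(proxy)
        .build();
CloseableHttpResponse response = client.execute(new HttpGet("https://www.example.com/"));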
Did you look at Apache HttpClient? I haven't used it in ages, but I did use it to pick a proxy server dynamically. Example from the site (this is the older HttpClient 3.x API):
HttpClient httpclient = new HttpClient();
httpclient.getHostConfiguration().setProxy("myproxyhost", 8080);
httpclient.getState().setProxyCredentials("my-proxy-realm", "myproxyhost",
        new UsernamePasswordCredentials("my-proxy-username", "my-proxy-password"));
GetMethod httpget = new GetMethod("https://www.verisign.com/");
try {
    httpclient.executeMethod(httpget);
    System.out.println(httpget.getStatusLine());
} finally {
    httpget.releaseConnection();
}
System.setProperty("http.proxyHost", "proxy.com");
System.setPropery("http.proxyPort", "8080");
URL url = new URL("http://java.sun.com/");
InputStream in = url.openStream();
http://java.sun.com/javase/6/docs/technotes/guides/net/proxies.html