I'm currently trying to build an OSGi service that sends a POST request to a defined API. This API virus-scans a file that is contained in the request body (JSON) as a Base64 string.
For this, I am using the Apache HttpClient contained in the Adobe AEM uberjar v6.4.0.
My current implementation works fine for smaller files (< 2 MB), but as the file size grows, the behaviour gets strange:
When I upload a 9 MB file, the request executes for about a minute, then gets an HTTP 400 response, and afterwards retries the request 7 times.
I tried to use a timeout with the request. If the timeout is below 60,000 ms, a TimeoutException is thrown; if it is greater than 60,000 ms, I get an HTTP 400 Bad Request. I guess the latter is the API's fault, which I need to clarify.
However, in both cases, after the exception is thrown, HttpClient retries the request, and I have not been able to prevent that. I'm struggling with the many deprecated how-tos on the web, so now I'm here.
I have shortened the code a bit, as it is somewhat long (mostly by removing debug messages and some "if ... return false" checks at the beginning). My code:
public boolean isAttachmentClean(InputStream inputStream) throws IOException, JSONException, ServiceUnavailableException {
    // prevent httpClient from retrying in case of an IOException
    final HttpRequestRetryHandler retryHandler = new DefaultHttpRequestRetryHandler(0, false);
    HttpClient httpClient = HttpClients.custom().setRetryHandler(retryHandler).build();

    HttpPost httpPost = new HttpPost(serviceUrl);
    httpPost.setHeader("accept", "application/json");
    // set some more headers...

    // set timeout for POST from OSGi config
    RequestConfig timeoutConfig = RequestConfig.custom()
            .setConnectionRequestTimeout(serviceTimeout)
            .setConnectTimeout(serviceTimeout)
            .setSocketTimeout(serviceTimeout)
            .build();
    httpPost.setConfig(timeoutConfig);

    // create request body data
    String requestBody;
    try {
        requestBody = buildDataJson(inputStream);
    } finally {
        inputStream.close();
    }
    HttpEntity requestBodyEntity = new ByteArrayEntity(requestBody.getBytes("UTF-8"));
    httpPost.setEntity(requestBodyEntity);

    // execute and get the response
    HttpResponse response = httpClient.execute(httpPost);
    if (response.getStatusLine().getStatusCode() != HttpServletResponse.SC_OK) {
        httpPost.abort();
        throw new ServiceUnavailableException("API not available, Response Code was " + response.getStatusLine().getStatusCode());
    }

    HttpEntity entity = response.getEntity();
    boolean result = false;
    if (entity != null) {
        InputStream apiResult = entity.getContent();
        try {
            // check the response from the API (virus yes or no)
            result = evaluateResponse(apiResult);
        } finally {
            apiResult.close();
        }
    }
    return result;
}
"buildDataJson()" simply reads the InputStream and creates a JSON needed for the API call.
"evaluateResponse()" also reads the InputStream, transforms it into a JSON and checks for a property named "Status:" "Clean".
I'd appreciate any tips on why this request is retried over and over again.
/edit: So far I have found that Apache HttpClient has a default mechanism that retries a request in case of an IOException, which is what I get here. Still, I have not found a way to deactivate these retries.
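For what it's worth, HttpClientBuilder also has a disableAutomaticRetries() switch; a minimal sketch (assuming HttpClient 4.3+ as shipped in the uberjar) would look like the following. If requests are still retried with such a client, the retry is most likely happening in a layer above HttpClient rather than in the client itself:

import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

// Sketch: build a client with HttpClient's own automatic retries switched off.
// Functionally similar to the zero-retry DefaultHttpRequestRetryHandler above.
CloseableHttpClient httpClient = HttpClients.custom()
        .disableAutomaticRetries()
        .build();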
In my Xpages application I am calling an external service to collect data.
Users are complaining that they sometimes get a timeout error message:
Connect to customerbank.acme.se:20543 [customerbank.acme.se/127.17.27.172] failed: Connection timed out: connect
I assumed the timeout would result in an IOException but apparently not. How can I catch this error?
Below is part of my code; I have left out the logic for handling the response.
private CloseableHttpClient httpclient;

try {
    HttpClientBuilder cb = HttpClientBuilder.create();
    RequestConfig requestConfig = RequestConfig.custom()
            .setSocketTimeout(30 * 1000)
            .setConnectTimeout(30 * 1000)
            .setConnectionRequestTimeout(30 * 1000)
            .build();
    cb.setDefaultRequestConfig(requestConfig);
    httpclient = cb.build();

    HttpPost httpPost = new HttpPost(urlFromConfiguration);
    httpPost.setHeader("Content-Type", "application/json");
    HttpEntity entity = new ByteArrayEntity(JSONobj.toString().getBytes("UTF-8"));
    httpPost.setEntity(entity);

    CloseableHttpResponse response = httpclient.execute(httpPost);
    if (200 == response.getStatusLine().getStatusCode()) { // response received
        // perform some logic with the response...
    }
} catch (IOException e) {
    OpenLogUtil.logError(e);
    FacesContext.getCurrentInstance().addMessage(null, new javax.faces.application.FacesMessage(
            javax.faces.application.FacesMessage.SEVERITY_ERROR, "some IO exception occurred", ""));
} catch (Exception e) {
    OpenLogUtil.logError(e);
    FacesContext.getCurrentInstance().addMessage(null, new javax.faces.application.FacesMessage(
            javax.faces.application.FacesMessage.SEVERITY_ERROR, "some general error has occurred", ""));
}
I think this Baeldung page can help you:
"Note that the connection timeout will result in an
org.apache.http.conn.ConnectTimeoutException being thrown, while
socket timeout will result in a java.net.SocketTimeoutException."
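Building on that quote, here is a minimal sketch of how the catch blocks could be ordered, reusing the OpenLogUtil/FacesMessage handling from the snippet above. Note that both timeout exceptions extend IOException, so they must be caught before the generic IOException; the "Connection timed out: connect" message may also surface as an org.apache.http.conn.HttpHostConnectException, which is likewise an IOException:

try {
    CloseableHttpResponse response = httpclient.execute(httpPost);
    // ...handle the response as before...
} catch (org.apache.http.conn.ConnectTimeoutException e) {
    // the connection to the external service could not be established in time
    OpenLogUtil.logError(e);
    FacesContext.getCurrentInstance().addMessage(null, new javax.faces.application.FacesMessage(
            javax.faces.application.FacesMessage.SEVERITY_ERROR,
            "The external service did not respond (connect timeout).", ""));
} catch (java.net.SocketTimeoutException e) {
    // the connection was established, but the service took too long to answer
    OpenLogUtil.logError(e);
    FacesContext.getCurrentInstance().addMessage(null, new javax.faces.application.FacesMessage(
            javax.faces.application.FacesMessage.SEVERITY_ERROR,
            "The external service took too long to answer (read timeout).", ""));
} catch (IOException e) {
    // any other I/O problem
    OpenLogUtil.logError(e);
    FacesContext.getCurrentInstance().addMessage(null, new javax.faces.application.FacesMessage(
            javax.faces.application.FacesMessage.SEVERITY_ERROR, "some IO exception occurred", ""));
}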
The Apache HttpClient that you are using is a great utility, but it can be a bit heavy and cumbersome for the relatively simple task you are running. There is a much simpler HTTP client in the MgntUtils open-source library (written by me). It may not be as comprehensive as the Apache one, but it is much simpler to use, and it does throw an IOException upon a connection or timeout error. In your case it could be an alternative. Take a look at the Javadoc; the library itself is provided as Maven artifacts and on Git (including source code and Javadoc). All in all, your code may look like this:
private static void testHttpClient() {
    HttpClient client = new HttpClient();
    client.setContentType("application/json");
    String content = null;
    try {
        content = client.sendHttpRequest("http://yourUrl.com", HttpMethod.POST, JSONobj.toString());
        // content holds the response. Do your logic here
    } catch (IOException e) {
        // Error handling is here
        content = TextUtils.getStacktrace(e, false);
    }
}
This is a little weird, and I have done a good deal of research trying to find the cause of and a resolution for this issue. My objective is to download a zip file from a secured URL that also requires login.
Everything works perfectly when I use the Apache HttpClient Maven dependency at version 4.3.6. However, I cannot use this version, because my aws-sdk-java-core Maven dependency also depends on HttpClient, and using v4.3.6 makes the aws-sdk-java complain with a NoSuchMethod runtime exception. I understand this issue: the Apache HttpClient v4.3.6 dependency is nearer in the Maven dependency tree than the version (4.5.1) used by the aws-sdk-java-core dependency. Anyway, I will leave out further details, because I am pretty sure I should make everything work with one version of a Maven dependency rather than use multiple versions of the same jar.
Back to the original question. As I cannot use v4.3.6, I told my code to use v4.5.1, and that's when the file-download code started giving problems. When I use HttpClient v4.5.1, the response gives me the following HTML content rather than the zip file at the requested HTTPS URL.
<html>
<HEAD><META HTTP-EQUIV='PRAGMA' CONTENT='NO-CACHE'><META HTTP-EQUIV='CACHE-CONTROL' CONTENT='NO-CACHE'>
<TITLE>SAML 2.0 Auto-POST form</TITLE>
</HEAD>
<body onLoad="document.forms[0].submit()">
<NOSCRIPT>Your browser does not support JavaScript. Please click the
'Continue' button below to proceed. <br><br>
</NOSCRIPT>
<form action="https://githubext.deere.com/saml/consume" method="POST">
<input type="hidden" name="SAMLResponse" value="PFJlc3BvbnNlIHhtbG5zPSJ1cm46b2FzaXM6bmFtZXM6dGM6U0FNTDoyLjA6cHJvdG9jb2wiIERl">
<input type="hidden" name="RelayState" value="2F1HpzrUy5FdX">
<NOSCRIPT><INPUT TYPE="SUBMIT" VALUE="Continue"></NOSCRIPT>
</form>
</body>
</html>
When I use v4.3.6, the response gives me the zip file as expected. I have tried manually submitting this HTML form by adding more code, but the response remains the same.
The original code I have for file download is provided below.
@Component
public class FileDAO {

    public static void main(String args[]) throws Exception {
        new FileDAO().loadFile("https://some_url.domain.com/zipball/master", "myfile.zip");
    }

    public String loadFile(String url, String fileName) throws ClientProtocolException, IOException {
        HttpClient client = login();
        HttpResponse response = client.execute(new HttpGet(url));
        int statusCode = response.getStatusLine().getStatusCode();
        if (statusCode == 200) {
            String unzipToFolderName = fileName.replace(".", "_");
            FileOutputStream outputStream = new FileOutputStream(new File(fileName));
            writeToFile(outputStream, response.getEntity().getContent());
            return unzipToFolderName;
        } else {
            throw new RuntimeException("error downloading file, HTTP Status code: " + statusCode);
        }
    }

    private void writeToFile(FileOutputStream outputStream, InputStream inputStream) {
        try {
            int read = 0;
            byte[] bytes = new byte[1024];
            while ((read = inputStream.read(bytes)) != -1) {
                outputStream.write(bytes, 0, read);
            }
        } catch (Exception ex) {
            throw new RuntimeException("error writing zip file, error message : " + ex.getMessage(), ex);
        } finally {
            try {
                outputStream.close();
                inputStream.close();
            } catch (Exception ex) {}
        }
    }

    private HttpClient login() throws IOException {
        HttpClient client = getHttpClient();
        HttpResponse response = client.execute(new HttpGet("https://some_url.domain.com"));
        String responseBody = EntityUtils.toString(response.getEntity());
        Document doc = Jsoup.parse(responseBody);
        org.jsoup.select.Elements inputs = doc.getElementsByTag("input");
        int statusCode = response.getStatusLine().getStatusCode();
        if (statusCode == 200) {
            HttpPost httpPost = new HttpPost("https://some_url.domain.com/saml/consume");
            List<NameValuePair> data = new ArrayList<NameValuePair>();
            data.add(new BasicNameValuePair("SAMLResponse", doc.select("input[name=SAMLResponse]").val()));
            data.add(new BasicNameValuePair("RelayState", doc.select("input[name=RelayState]").val()));
            httpPost.setEntity(new UrlEncodedFormEntity(data));
            HttpResponse logingResponse = client.execute(httpPost);
            int loginStatusCode = logingResponse.getStatusLine().getStatusCode();
            if (loginStatusCode != 302) {
                throw new RuntimeException("clone repo dao. error during login, HTTP Status code: " + loginStatusCode);
            }
        }
        return client;
    }

    private HttpClient getHttpClient() {
        CredentialsProvider provider = new BasicCredentialsProvider();
        UsernamePasswordCredentials credentials = new UsernamePasswordCredentials("userId", "password");
        provider.setCredentials(AuthScope.ANY, credentials);
        return HttpClientBuilder.create().setDefaultCredentialsProvider(provider).build();
    }
}
I am still analyzing what is going wrong with Apache HttpClient versions other than 4.3.6. The same code works with 4.3.6 but not with versions above it.
Any help is really appreciated. Thank you all.
Issue resolved. After going through the Apache HttpClient documentation and seriously debugging the logs, I was able to resolve this issue. I created two server logs, one for v4.3.6 and one for v4.5.2, and when I compared them I found that the culprit was the cookie spec. In the old version the cookie spec was (automatically) configured as BEST_MATCH and it worked. In v4.5.2, however, the BEST_MATCH cookie spec has been deprecated by Apache. I experimented with the cookie settings by adding some more code, but the cookies sent in the server response did not match the DEFAULT cookie spec I had configured in the client code. As a result the cookies were not set up properly, which is why the response returned the SAML login page again instead of the zip file.
The Apache documentation says this about the cookie specifications:
Default: Default cookie policy is a synthetic policy that picks up either RFC 2965, RFC 2109 or Netscape draft compliant implementation based on properties of cookies sent with the HTTP response (such as version attribute, now obsolete). This policy will be deprecated in favor of the standard (RFC 6265 compliant) implementation in the next minor release of HttpClient.
Standard strict: State management policy compliant with the syntax and semantics of the well-behaved profile defined by RFC 6265, section 4.
I updated the cookie configuration to the STANDARD_STRICT spec, and everything started working with the latest version, 4.5.2.
Here is the updated getHttpClient() method:
private CloseableHttpClient getHttpClient() {
    CredentialsProvider provider = new BasicCredentialsProvider();
    UsernamePasswordCredentials credentials = new UsernamePasswordCredentials(gitUserId, gitPassword);
    provider.setCredentials(AuthScope.ANY, credentials);
    RequestConfig config = RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD_STRICT).build();
    return HttpClientBuilder.create()
            .setDefaultCredentialsProvider(provider)
            .setDefaultRequestConfig(config)
            .setRedirectStrategy(new LaxRedirectStrategy())
            .build();
}
In Java, I want to send an HttpPost every 5 seconds without waiting for the response. How can I do that?
I use the following code:
HttpClient httpClient = new DefaultHttpClient();
HttpPost post = new HttpPost(url);
StringEntity params = new StringEntity(json.toString() + "\n");
post.addHeader("content-type", "application/json");
post.setEntity(params);
httpClient.execute(post);
Thread.sleep(5000);
httpClient.execute(post);
but it does not work.
Even though I discard the previous connection and set up a new connection to send the second request, the second execute call is always blocked.
Your question leaves a lot of things open, but the basic point can be achieved with:
while (true) { // executes indefinitely; replace with your own condition
    Thread.sleep(5000);       // wait five seconds
    httpClient.execute(post); // execute your request
}
I tried your code and I got this exception:
java.lang.IllegalStateException: Invalid use of BasicClientConnManager: connection still allocated.
Make sure to release the connection before allocating another one.
This exception is already covered in HttpClient 4.0.1 - how to release connection?
I was able to release the connection by consuming the response with the following code:
public void sendMultipleRequests() throws ClientProtocolException, IOException, InterruptedException {
    HttpClient httpClient = new DefaultHttpClient();
    HttpPost post = new HttpPost("http://www.google.com");

    HttpResponse response = httpClient.execute(post);
    HttpEntity entity = response.getEntity();
    EntityUtils.consume(entity);

    Thread.sleep(5000);

    response = httpClient.execute(post);
    entity = response.getEntity();
    EntityUtils.consume(entity);
}
Using DefaultHttpClient is synchronous, which means the program is blocked waiting for the response. Instead, you could use the async-http-client library to perform asynchronous requests (you can download the jar files from search.maven.org if you're not familiar with Maven). Sample code may look like this:
import com.ning.http.client.*; // imports

try {
    AsyncHttpClient asyncHttpClient = new AsyncHttpClient();
    while (true) {
        asyncHttpClient
                .preparePost("http://your.url/")
                .addParameter("postVariableName", "postVariableValue")
                .execute(); // just execute the request and ignore the response
        System.out.println("Request sent");
        Thread.sleep(5000);
    }
} catch (Exception e) {
    System.out.println("oops..." + e);
}
I have a secure site that needs to display images coming from external non-HTTPS URLs on certain pages. I want to create a servlet that is used only as a proxy to pass the image data through to the pages. One way is to use Apache HttpClient to download the image data and then use IOUtils.copy to copy the data to the servlet's response.
Is there a simpler way?
UPDATE: The reason for this is to avoid browser warnings.
This is what I ended up using:
protected void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    try {
        String url = request.getParameter("url");
        HttpClient httpClient = new DefaultHttpClient();
        HttpGet httpGet = new HttpGet(url);
        HttpResponse httpResponse = httpClient.execute(httpGet);
        HttpEntity httpEntity = httpResponse.getEntity();
        InputStream inputStream = httpEntity.getContent();
        response.setContentType("image/jpeg");
        IOUtils.copy(inputStream, response.getOutputStream());
    } catch (Exception e) {
        AppLogger.log(e);
    }
}
If anyone has a better way to accomplish this, please post it.
If I understand correctly, you don't need anything like that: just return the references to the images, audio, or anything else in your HTML response, and the browser will take care of making the requests to the server that hosts each of the resources. If they're reachable, they will be displayed on the client.
Does anyone have a good tutorial on how to write a Java or JavaFX cURL-like application? I have seen tons of tutorials on how to make an external call to, say, an XML file, but the XML feed I am trying to retrieve requires you to submit a username and password before you can retrieve it.
What are you trying to accomplish? Are you trying to retrieve an XML feed over HTTP?
In that case I suggest you take a look at Apache HttpClient. It offers functionality similar to cURL, but in a pure Java way (cURL is a native C application). HttpClient supports multiple authentication mechanisms. For example, you can submit a username/password using Basic Authentication like this:
public static void main(String[] args) throws Exception {
    DefaultHttpClient httpclient = new DefaultHttpClient();
    httpclient.getCredentialsProvider().setCredentials(
            new AuthScope("localhost", 443),
            new UsernamePasswordCredentials("username", "password"));

    HttpGet httpget = new HttpGet("https://localhost/protected");

    System.out.println("executing request" + httpget.getRequestLine());
    HttpResponse response = httpclient.execute(httpget);
    HttpEntity entity = response.getEntity();

    System.out.println("----------------------------------------");
    System.out.println(response.getStatusLine());
    if (entity != null) {
        System.out.println("Response content length: " + entity.getContentLength());
    }
    if (entity != null) {
        entity.consumeContent();
    }

    // When the HttpClient instance is no longer needed,
    // shut down the connection manager to ensure
    // immediate deallocation of all system resources
    httpclient.getConnectionManager().shutdown();
}
Check the website for more examples.