CloseableHttpAsyncClient terminates with ConnectionClosedException: Connection closed unexpectedly - java

I am working on a file downloader that submits GET requests for around a thousand files. I came across this article, which would aid in submitting a lot of requests using the executor framework. It worked when I tried a smaller number of files (around a hundred); however, running the large number of files resulted in ConnectionClosedException.
This is the download code that submits the requests:
void download(String sObjname, List<FileMetadata> blobList) throws IOException, InterruptedException
{
    long totalSize = 0;
    this.sObjname = sObjname;
    for (FileMetadata doc : blobList)
    {
        totalSize += doc.getSize();
        doc.setStatus(JobStatus.INIT_COMPLETE);
    }
    totalFileSize = new AtomicLong(totalSize);
    // Async client definition; MAX_CONN around 5-15
    try (CloseableHttpAsyncClient httpclient = HttpAsyncClients.custom().setMaxConnPerRoute(MAX_CONN)
            .setMaxConnTotal(MAX_CONN).build())
    {
        httpclient.start();
        // Define the callback for handling the response and marking the status
        FutureCallback<String> futureCallback = new FutureCallback<String>() {
            @Override
            public void cancelled()
            {
                logger.error("Task cancelled in the rest client.");
                shutdownLatch.countDown();
            }

            @Override
            public void completed(String docPath)
            {
                FileMetadata doc = futureMap.get(docPath);
                logger.info(doc.getPath() + " download completed");
                totalFileSize.addAndGet(-1 * doc.getSize());
                doc.setStatus(JobStatus.WRITE_COMPLETE);
                shutdownLatch.countDown();
            }

            @Override
            public void failed(Exception e)
            {
                shutdownLatch.countDown();
                logger.error("Exception caught under failed for " + sObjname + " " + e.getMessage(), e);
                Throwable cause = e.getCause();
                if (cause != null && cause.getClass().equals(ClientProtocolException.class))
                {
                    String message = cause.getMessage();
                    // TODO Remove this
                    logger.error("Cause message: " + message);
                    String filePath = message.split("Unable to download the file ")[1].split(" ")[0];
                    futureMap.get(filePath).setStatus(JobStatus.WRITE_FAILED);
                }
            }
        };

        // Submit the get requests here
        String folderPath = SalesforceUtility.getFolderPath(sObjname);
        new File(new StringBuilder(folderPath).append(File.separator).append(Constants.FILES).toString()).mkdir();
        String body = (sObjname.equals(Constants.contentVersion)) ? "/VersionData" : "/body";
        shutdownLatch = new CountDownLatch(blobList.size());
        for (FileMetadata doc : blobList)
        {
            String uri = baseUri + "/sobjects/" + sObjname + "/" + doc.getId() + body;
            HttpGet httpGet = new HttpGet(uri);
            httpGet.addHeader(oauthHeader);
            doc.setStatus(JobStatus.WRITING);

            // Producer definition
            HttpAsyncRequestProducer producer = HttpAsyncMethods.create(httpGet);

            // Consumer definition
            File docFile = new File(doc.getPath());
            HttpAsyncResponseConsumer<String> consumer = new ZeroCopyConsumer<String>(docFile) {
                @Override
                protected String process(final HttpResponse response, final File file,
                        final ContentType contentType) throws Exception
                {
                    if (response.getStatusLine().getStatusCode() != HttpStatus.SC_OK)
                    {
                        throw new ClientProtocolException("Unable to download the file " + file.getAbsolutePath()
                                + ". Error code: " + response.getStatusLine().getStatusCode() + "; Error message: "
                                + response.getStatusLine());
                    }
                    return file.getAbsolutePath();
                }
            };

            // Execute the request
            logger.info("Submitted download for " + doc.getPath());
            httpclient.execute(producer, consumer, futureCallback);
            futureMap.put(doc.getPath(), doc);
        }
        if (futureMap.size() > 0)
            schedExec.scheduleAtFixedRate(timerRunnable, 0, 5, TimeUnit.MINUTES);
        logger.debug("Waiting for download results for " + sObjname);
        shutdownLatch.await();
    }
    finally
    {
        schedExec.shutdown();
        schedExec.awaitTermination(24, TimeUnit.HOURS);
        logger.debug("Finished downloading files for " + sObjname);
    }
}
The stacktrace that I received was:
org.apache.http.ConnectionClosedException: Connection closed unexpectedly
at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.closed(HttpAsyncRequestExecutor.java:139) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.client.InternalIODispatch.onClosed(InternalIODispatch.java:71) [httpasyncclient-4.1.1.jar:4.1.1]
at org.apache.http.impl.nio.client.InternalIODispatch.onClosed(InternalIODispatch.java:39) [httpasyncclient-4.1.1.jar:4.1.1]
at org.apache.http.impl.nio.reactor.AbstractIODispatch.disconnected(AbstractIODispatch.java:102) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.reactor.BaseIOReactor.sessionClosed(BaseIOReactor.java:281) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processClosedSessions(AbstractIOReactor.java:442) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:285) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:106) [httpcore-nio-4.4.4.jar:4.4.4]
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:590) [httpcore-nio-4.4.4.jar:4.4.4]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_72]
This was thrown for a number of worker threads.

Thanks to @lucasvc, the default behaviour is explained here. Pertaining to my solution, the code was updated to the following and the issue no longer appeared.
IOReactorConfig reactorConfig = IOReactorConfig.custom()
        .setConnectTimeout(TIMEOUT_5_MINS_IN_MILLIS)
        .setSoTimeout(TIMEOUT_5_MINS_IN_MILLIS).build();
try (CloseableHttpAsyncClient asyncClient = HttpAsyncClients.custom()
        .setDefaultIOReactorConfig(reactorConfig)
        .setDefaultHeaders(Collections.singletonList(oauthHeader))
        .setMaxConnPerRoute(MAX_CONN)
        .setMaxConnTotal(MAX_CONN).build())
{
    // ...
}
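The key setting is setSoTimeout: with the default configuration there is no socket timeout, so requests could wait on dead connections until the reactor reported them closed. For completeness, a minimal sketch of the same timeouts applied through a default RequestConfig (assuming HttpAsyncClient 4.1.x; untested here, unlike the reactor-level fix above):

RequestConfig requestConfig = RequestConfig.custom()
        .setConnectTimeout(TIMEOUT_5_MINS_IN_MILLIS) // time allowed to establish the connection
        .setSocketTimeout(TIMEOUT_5_MINS_IN_MILLIS)  // max inactivity between data packets
        .build();
CloseableHttpAsyncClient asyncClient = HttpAsyncClients.custom()
        .setDefaultRequestConfig(requestConfig)
        .setMaxConnPerRoute(MAX_CONN)
        .setMaxConnTotal(MAX_CONN)
        .build();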

Related

Uploading image from ESP32 Cam to Spring boot using Multipart

I am currently trying to upload an image captured on an ESP32 Cam board to a Spring server via HTTP POST. However, I keep getting an error saying
Required request part 'image' is not present
I have tried multiple solutions across the web but with no success. Here's the Arduino code:
String serverUrl = "192.168.100.4";
String serverPath = "/identify";
const int serverPort = 8080;
String contentLengthStr = String("Content-Length=");
String crLf = String("\r\n");
String bodyStart = "--claudiu\r\nContent-Disposition: form-data; name=\"image\"; filename=\"esp32-cam.jpg\"\r\nContent-Type: image/jpeg\r\n\r\n";
String bodyEnd = "\r\n--claudiu--\r\n";
String sendPhoto() {
    camera_fb_t * frameBuffer = NULL;
    frameBuffer = esp_camera_fb_get();
    if (!frameBuffer) {
        Serial.println("Camera capture failed");
        delay(1000);
        ESP.restart();
    }
    Serial.println("Connecting to server: " + serverUrl);
    if (client.connect(serverUrl.c_str(), serverPort)) {
        Serial.println("Connection successful!");
        uint32_t imageLength = frameBuffer->len;
        uint32_t contentLength = bodyStart.length() + imageLength + bodyEnd.length();
        Serial.print("Content length: ");
        Serial.println(String(contentLength));
        Serial.println("Posting image");
        client.println("POST " + serverPath + " HTTP/1.1");
        client.println("Host: " + serverUrl);
        client.println("Content-Length: " + String(contentLength));
        client.println("Content-Type: multipart/form-data; boundary=claudiu\r\n");
        client.print(bodyStart);
        uint8_t *fbBuf = frameBuffer->buf;
        size_t fbLen = frameBuffer->len;
        for (size_t n = 0; n < fbLen; n = n + 1024) {
            if (n + 1024 < fbLen) {
                client.write(fbBuf, 1024);
                fbBuf += 1024;
            }
            else if (fbLen % 1024 > 0) {
                size_t remainder = fbLen % 1024;
                client.write(fbBuf, remainder);
            }
        }
        client.print(bodyEnd);
        esp_camera_fb_return(frameBuffer);
        Serial.println("Response:");
        while (client.connected()) {
            if (client.available()) {
                String response = client.readStringUntil('\n');
                Serial.println(response.c_str());
            }
        }
        client.stop();
        delay(50);
        esp_camera_fb_return(frameBuffer);
    }
    else {
        Serial.println("Connection to " + serverUrl + " failed.");
    }
    return "DONE";
}
My Spring Controller looks like this:
@RestController
public class RequestController {
    @PostMapping(value = "/identify", consumes = "multipart/form-data")
    public String identify(@RequestParam("imageFile") MultipartFile imageFile) throws IOException {
        System.out.println(imageFile.getOriginalFilename());
        FileUtil.saveFile("/images/", imageFile.getOriginalFilename(), imageFile);
        return "WORKS";
    }
}
The connection is successful but I cannot get a proper response from the server.
Could you please help me find the problem? I have been looking for the past few hours but I can't find anything.
I am not familiar with Arduino or embedded systems code at all, but I have some Spring Boot experience. Try something like this in the API resource code. I hope it helps; let me know if it works.
@RestController
public class RequestController {
    @PostMapping(value = "/identify", consumes = "multipart/form-data")
    public ResponseEntity<String> identify(@RequestParam("imageFile") MultipartFile imageFile) throws IOException {
        System.out.println(imageFile.getOriginalFilename());
        FileUtil.saveFile("/images/", imageFile.getOriginalFilename(), imageFile);
        return ResponseEntity.ok("WORKS");
    }
}
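Beyond the response type, the multipart part name has to line up end to end: the Arduino body declares name="image" while the controller binds "imageFile", and Spring rejects the request with exactly a "Required request part ... is not present" error when the names differ. A minimal sketch, assuming the ESP32 body is kept as-is with name="image":

// Sketch: bind the multipart part named "image" sent by the board
@PostMapping(value = "/identify", consumes = "multipart/form-data")
public ResponseEntity<String> identify(@RequestParam("image") MultipartFile imageFile) throws IOException {
    FileUtil.saveFile("/images/", imageFile.getOriginalFilename(), imageFile);
    return ResponseEntity.ok("WORKS");
}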

Google App Engine Exception - Current thread is not associated with any request and is not a background thread

I am trying to hit Google Analytics through GAE APIs from a cross-platform Java desktop application. I followed the sample code from GitHub, but due to a threading problem the request isn't being executed.
https://github.com/GoogleCloudPlatform/appengine-googleanalytics-java/blob/master/src/main/java/com/google/appengine/analytics/tracking/GoogleAnalyticsTracking.java
Here is my sample code:
ThreadManager.currentRequestThreadFactory().newThread(new Runnable() {
    @Override
    public void run() {
        try {
            GoogleAccountTracker.trackEventToGoogleAnalytics("category1", "action1", "label1", "value1");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
public static int trackEventToGoogleAnalytics(String category, String action, String label, String value) throws IOException {
    Application.LOGGER.info("trackEventToGoogleAnalytics() started");
    Map<String, String> map = new LinkedHashMap<>();
    map.put("v", "1"); // Version.
    map.put("tid", gaTrackingId);
    map.put("cid", gaClientId);
    map.put("t", "event"); // Event hit type.
    map.put("ec", encode(category, true));
    map.put("ea", encode(action, true));
    map.put("el", encode(label, false));
    map.put("ev", encode(value, false));
    HTTPRequest request = new HTTPRequest(GA_URL_ENDPOINT, HTTPMethod.POST);
    request.addHeader(CONTENT_TYPE_HEADER);
    request.setPayload(getPostData(map));
    HTTPResponse response = urlFetchService.fetch(request);
    byte[] content = response.getContent();
    URL finalUrl = response.getFinalUrl();
    // 200, 404, 500, etc.
    int responseCode = response.getResponseCode();
    List<HTTPHeader> headers = response.getHeaders();
    for (HTTPHeader header : headers) {
        String headerName = header.getName();
        String headerValue = header.getValue();
        Application.LOGGER.info(headerName + " " + headerValue);
    }
    Application.LOGGER.info("trackEventToGoogleAnalytics() ended");
    return 0; // httpResponse.getResponseCode();
}
The exception I am getting:
Caused by: java.lang.NullPointerException: Current thread is not associated with any request and is not a background thread
at com.google.appengine.api.ThreadManager.getCurrentEnvironmentOrThrow(ThreadManager.java:106)
at com.google.appengine.api.ThreadManager.currentRequestThreadFactory(ThreadManager.java:47)
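ThreadManager.currentRequestThreadFactory() only works on a thread that App Engine associates with an active request, which a cross-platform desktop application never has; note also that the thread built by newThread(...) above is never start()ed. A minimal sketch, assuming the tracking call can run on an ordinary JVM thread (the GAE-only urlFetchService inside it would then need a plain HTTP client as well):

import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Plain executor for code running outside a GAE request thread
ExecutorService executor = Executors.newSingleThreadExecutor();
executor.submit(() -> {
    try {
        GoogleAccountTracker.trackEventToGoogleAnalytics("category1", "action1", "label1", "value1");
    } catch (IOException e) {
        e.printStackTrace();
    }
});
executor.shutdown();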

Java FTP using Apache commons throws "IOException caught while copying"

I have made a JavaFX application which includes uploading large files (> 1 GB) to a server. Every time I get the same error at the same place. Any suggestions on what I am doing wrong here?
ftpclient.connect(server, port);
ftpclient.login(ftpuser, ftppass);
ftpclient.enterLocalPassiveMode();
ftpclient.setKeepAlive(true);
ftpclient.setControlKeepAliveTimeout(3000);
Task<Void> copyMnt = new Task<Void>() {
    @Override
    protected Void call() {
        try {
            new Thread(new FTPHandler(ftpclient, source, dest)).run();
        } catch (IOException ex) {
            Logger.getLogger(MyAppController.class.getName()).log(Level.SEVERE, null, ex);
        }
        return null;
    }
};
new Thread(copyMnt).start();
Now, in the FTPHandler class:
// The constructor will set the ftpclient, source and destinations.
@Override
public void run() {
    try {
        uploadDirectory(this.getClient(), this.getDest(), this.getSrc(), "");
    } catch (IOException ex) {
        Logger.getLogger(FTPHandler.class.getName()).log(Level.SEVERE, null, ex);
    }
}
public static void uploadDirectory(FTPClient ftpClient,
        String remoteDirPath, String localParentDir, String remoteParentDir)
        throws IOException {
    File localDir = new File(localParentDir);
    File[] subFiles = localDir.listFiles();
    if (subFiles != null && subFiles.length > 0) {
        for (File item : subFiles) {
            String remoteFilePath = remoteDirPath + "/" + remoteParentDir
                    + "/" + item.getName();
            if (remoteParentDir.equals("")) {
                remoteFilePath = remoteDirPath + "/" + item.getName();
            }
            if (item.isFile()) {
                // upload the file
                String localFilePath = item.getAbsolutePath();
                java.util.Date date = new java.util.Date();
                System.out.println(new Timestamp(date.getTime()) + " : Uploading :: " + localFilePath + " to " + remoteFilePath);
                boolean uploaded = uploadSingleFile(ftpClient, localFilePath, remoteFilePath);
                if (uploaded) {
                    System.out.println("Success : " + remoteFilePath);
                } else {
                    System.out.println("Failed : " + localFilePath);
                }
            } else {
                // create directory on the server
                boolean created = ftpClient.makeDirectory(remoteFilePath);
                if (created) {
                    System.out.println("CREATED the directory: " + remoteFilePath);
                } else {
                    System.out.println("COULD NOT create the directory: " + remoteFilePath);
                }
                // upload the sub directory
                String parent = remoteParentDir + "/" + item.getName();
                if (remoteParentDir.equals("")) {
                    parent = item.getName();
                }
                localParentDir = item.getAbsolutePath();
                uploadDirectory(ftpClient, remoteDirPath, localParentDir, parent);
            }
        }
    }
}
The files are of different types (.iso, .dat, etc.), and the upload sequence is a few hundred smaller files (less than a few MB each) followed by ten files larger than 1 GB. Every time, the smaller files and two of the ten large files upload successfully, but when the third large file starts uploading I get the following exception.
SEVERE: null
org.apache.commons.net.io.CopyStreamException: IOException caught while copying.
at org.apache.commons.net.io.Util.copyStream(Util.java:134)
at org.apache.commons.net.ftp.FTPClient._storeFile(FTPClient.java:653)
at org.apache.commons.net.ftp.FTPClient.__storeFile(FTPClient.java:624)
at org.apache.commons.net.ftp.FTPClient.storeFile(FTPClient.java:1976)
The CopyStreamException has a "cause" exception. Check it using .getCause() to see what went wrong.
See the Util.copyStream method:
public static final long copyStream(InputStream source, OutputStream dest,
        int bufferSize, long streamSize,
        CopyStreamListener listener,
        boolean flush)
        throws CopyStreamException
{
    int bytes;
    long total = 0;
    byte[] buffer = new byte[bufferSize >= 0 ? bufferSize : DEFAULT_COPY_BUFFER_SIZE];
    try
    {
        while ((bytes = source.read(buffer)) != -1)
        {
            ....
        }
    }
    catch (IOException e)
    {
        throw new CopyStreamException("IOException caught while copying.",
                total, e);
    }
    return total;
}
Somewhere in your uploadSingleFile function, do
try
{
    ftpClient.storeFile(...)
}
catch (Exception e)
{
    e.printStackTrace();
    if (e.getCause() != null)
    {
        e.getCause().printStackTrace();
    }
}
I do not know Java, so the code may not be 100% correct.
See also Getting full string stack trace including inner exception.
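One detail worth re-checking in the setup code above once the cause is printed: in Apache Commons Net, setControlKeepAliveTimeout takes seconds (so 3000 means 50 minutes between keep-alive NOOPs), while setDataTimeout takes milliseconds. A short sketch of plausible values (an assumption, not a confirmed fix for this stack trace):

// Commons Net units: control keep-alive in SECONDS, data timeout in MILLISECONDS
ftpclient.setControlKeepAliveTimeout(300); // NOOP on the control connection every 5 minutes
ftpclient.setDataTimeout(600000);          // give up on a stalled data connection after 10 minutes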

posting data to a servlet using httpclient

I'm trying to post 2 fields, id and data, to a servlet using HttpClient.
The problem is that if the length of the data field is less than 1MB or so, the servlet will get what I posted. But if the length of the data field is larger than 1MB or so, the servlet will receive null for all fields. What am I missing here? Thanks.
Here's the sample data that I post to the servlet:
id=12312123123123
data=the content of a file that is base-64 encoded
Here's the method that I use to post data to the servlet.
private byte[] post(String aUrl,
        Map<String, String> aParams,
        String aCharsetEnc,
        int aMaxWaitMs) throws Exception
{
    PostMethod post = null;
    try
    {
        HttpClient client = new HttpClient();
        post = new PostMethod(aUrl);
        post.setRequestHeader("Content-Type", "application/x-www-form-urlencoded; charset=" + aCharsetEnc);
        for (String key : aParams.keySet())
        {
            post.addParameter(key, aParams.get(key));
        }
        final int code = client.executeMethod(post);
        if (code == HttpStatus.SC_NO_CONTENT || code == HttpStatus.SC_NOT_FOUND)
        {
            return null;
        }
        else if (code != HttpStatus.SC_OK)
        {
            throw new HttpException("Error code " + code + " encountered.");
        }
        InputStream stream = post.getResponseBodyAsStream();
        if (stream != null)
        {
            return BlobHelper.readBytes(stream);
        }
        return null;
    }
    finally
    {
        if (post != null)
        {
            post.releaseConnection();
        }
    }
}
Here's the method of the servlet.
public void doPost(HttpServletRequest aReq, HttpServletResponse aResp)
        throws ServletException, IOException
{
    setNoCache(aResp);
    aResp.setContentType("text/plain");
    try
    {
        final String id = aReq.getParameter(PARAM_ID);
        final String dataStr = aReq.getParameter(PARAM_DATA);
        if (log().isDebugEnabled())
        {
            log().debug("id=" + id);
            log().debug("data=" + dataStr);
        }
    }
    catch (Exception e)
    {
    }
}
Usually servlet containers have a maximum POST size parameter.
For Tomcat you can follow the steps documented here (they should be similar for other app servers):
Is there a max size for POST parameter content?
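As a concrete illustration, assuming Tomcat: the limit is the maxPostSize attribute on the HTTP connector in conf/server.xml, measured in bytes and defaulting to 2 MB, which is the right order of magnitude for form fields vanishing above ~1 MB once base64 and URL encoding overhead are added:

<!-- conf/server.xml: maxPostSize is in bytes; zero or a negative value disables the limit -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxPostSize="10485760" />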

Crawl all the links of a page that is password protected

I am crawling a page that requires a username and password for authentication. I successfully got the 200 OK response back from the server for that page when I passed my username and password in the code. But it stops as soon as it gets the 200 OK response back; it doesn't move forward into that page after authentication to crawl all the links that are there. This crawler is taken from http://code.google.com/p/crawler4j/.
This is the code where I am doing the authentication stuff...
public class MyCrawler extends WebCrawler {
    Pattern filters = Pattern.compile(".*(\\.(css|js|bmp|gif|jpe?g"
            + "|png|tiff?|mid|mp2|mp3|mp4" + "|wav|avi|mov|mpeg|ram|m4v|pdf"
            + "|rm|smil|wmv|swf|wma|zip|rar|gz))$");
    List<String> exclusions;

    public MyCrawler() {
        exclusions = new ArrayList<String>();
        // Add here all your exclusions
        exclusions.add("http://www.dot.ca.gov/dist11/d11tmc/sdmap/cameras/cameras.html");
    }

    public boolean shouldVisit(WebURL url) {
        String href = url.getURL().toLowerCase();
        DefaultHttpClient client = null;
        try
        {
            System.out.println("----------------------------------------");
            System.out.println("WEB URL:- " + url);
            client = new DefaultHttpClient();
            client.getCredentialsProvider().setCredentials(
                    new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT, AuthScope.ANY_REALM),
                    new UsernamePasswordCredentials("test", "test"));
            client.getParams().setParameter(ClientPNames.ALLOW_CIRCULAR_REDIRECTS, true);
            for (String exclusion : exclusions) {
                if (href.startsWith(exclusion)) {
                    return false;
                }
            }
            if (href.startsWith("http://") || href.startsWith("https://")) {
                return true;
            }
            HttpGet request = new HttpGet(url.toString());
            System.out.println("----------------------------------------");
            System.out.println("executing request" + request.getRequestLine());
            HttpResponse response = client.execute(request);
            HttpEntity entity = response.getEntity();
            System.out.println(response.getStatusLine());
        }
        catch (Exception e) {
            e.printStackTrace();
        }
        return false;
    }

    public void visit(Page page) {
        System.out.println("hello");
        int docid = page.getWebURL().getDocid();
        String url = page.getWebURL().getURL();
        System.out.println("Page:- " + url);
        String text = page.getText();
        List<WebURL> links = page.getURLs();
        int parentDocid = page.getWebURL().getParentDocid();
        System.out.println("Docid: " + docid);
        System.out.println("URL: " + url);
        System.out.println("Text length: " + text.length());
        System.out.println("Number of links: " + links.size());
        System.out.println("Docid of parent page: " + parentDocid);
    }
}
And this is my Controller class
public class Controller {
    public static void main(String[] args) throws Exception {
        CrawlController controller = new CrawlController("/data/crawl/root");
        // And I want to crawl all those links that are there in this password protected page
        controller.addSeed("http://search.somehost.com/");
        controller.start(MyCrawler.class, 20);
        controller.setPolitenessDelay(200);
        controller.setMaximumCrawlDepth(2);
    }
}
Is there anything wrong I am doing?
As described at http://code.google.com/p/crawler4j/, the shouldVisit() function should only return true or false. But in your code, this function is also fetching the content of the page, which is wrong. The current version of crawler4j (3.0) doesn't support crawling password-protected pages.
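To make that concrete, a minimal sketch of a filter-only shouldVisit(), reusing the filters and exclusions fields from the question and doing no fetching of its own:

public boolean shouldVisit(WebURL url) {
    String href = url.getURL().toLowerCase();
    // Only decide whether to visit; crawler4j itself performs the fetch
    if (filters.matcher(href).matches()) {
        return false;
    }
    for (String exclusion : exclusions) {
        if (href.startsWith(exclusion)) {
            return false;
        }
    }
    return href.startsWith("http://") || href.startsWith("https://");
}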
