URL setConnectTimeout is not working - Java

I am using RSS feeds to get the latest news, and I get an XML response back.
The issue I am facing is that if the request takes longer than 5 seconds, I just want the program to stop.
This is my code (for testing purposes I have set the timeout to 1 second):
public static void main(String args[]) {
    Connection dbConnection = null;
    PreparedStatement inserpstmt = null;
    try {
        final JSONArray latestnews = new JSONArray();
        builder = getDocumentBuilderInstance();
        final URL url = new URL("http://www.rssmix.com/u/8171434/rss.xml");
        url.openConnection().setConnectTimeout(1000);
        url.openConnection().setReadTimeout(1000);
        final Document doc = builder.parse(url.openStream());
        final NodeList items = doc.getElementsByTagName("item");
        for (int i = 0; i < items.getLength(); i++) {
            final Element item = (Element) items.item(i);
            final String title = getValue(item, "title");
            System.out.println(title);
        }
    } catch (Exception e) {
        e.printStackTrace();
        e.getMessage();
    } catch (Throwable e) {
        e.getMessage();
        e.printStackTrace();
    } finally {
    }
}
But could you please let me know why this isn't stopping after 1 second and instead keeps waiting?
Edited Code
StringBuffer sb = new StringBuffer("http://www.rssmix.com/u/8171434/rss.xml");
URLConnection conn = new URL(sb.toString()).openConnection();
conn.setConnectTimeout(7000);
conn.setReadTimeout(7000);
final Document doc = builder.parse(new InputSource(conn.getInputStream()));

You should probably approach this in the following fashion:
final URL url = new URL("http://www.rssmix.com/u/8171434/rss.xml");
URLConnection urlConn = url.openConnection();
urlConn.setConnectTimeout(1000);
urlConn.setReadTimeout(1000);
final Document doc = builder.parse(urlConn.getInputStream());

In the above code, each call to openConnection() returns a new connection object. You then read via openStream(), which is equivalent to openConnection().getInputStream(), so the timeouts are set on different connection objects, and no timeout at all is set on the connection object the input stream is actually read from. That is why it was not working. The code below will work because the timeouts are set on the same object from which the InputStream is retrieved.
URLConnection connection = url.openConnection();
connection.setConnectTimeout(1000);
connection.setReadTimeout(1000);
connection.connect();
final Document doc = builder.parse(connection.getInputStream()); // stream taken from the same object the timeouts were set on
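For completeness, here is a minimal self-contained sketch of that fix (the class name and the 1-second limits are just for illustration). When either timeout elapses, the read fails with a java.net.SocketTimeoutException, which you can catch to stop the program:

import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class RssTimeoutDemo {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        URL url = new URL("http://www.rssmix.com/u/8171434/rss.xml");
        // Open the connection ONCE and keep using the same object.
        URLConnection connection = url.openConnection();
        connection.setConnectTimeout(1000); // max time to establish the TCP connection
        connection.setReadTimeout(1000);    // max time to wait for data once connected
        try {
            Document doc = builder.parse(connection.getInputStream());
            System.out.println("Items: " + doc.getElementsByTagName("item").getLength());
        } catch (SocketTimeoutException e) {
            // Thrown when either of the timeouts above elapses.
            System.err.println("Feed did not respond within 1 second: " + e.getMessage());
        }
    }
}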

Related

Processing a large InputStream into a List<String> in Java

Hey, I have a file of nearly 110 MB served by Apache. I am reading that file into an input stream and then converting the input stream into a List of Strings, based on all the suggestions I found on Stack Overflow. But I am still facing an out-of-memory issue.
Below is my code.
private List<String> readFromHttp(String url, PlainDiff diff) throws Exception {
    HttpUrlConnection con = new HttpUrlConnection();
    con.setGetUrl(url);
    List<String> lines = new ArrayList<String>();
    final String PREFIX = "stream2file";
    final String SUFFIX = ".tmp";
    final File tempFile = File.createTempFile(PREFIX, SUFFIX);
    tempFile.deleteOnExit();
    StringBuilder sb = new StringBuilder();
    try {
        InputStream data = con.sendGetInputStream();
        if (data == null)
            throw new UserAuthException("diff is not available at the location");
        else {
            try (FileOutputStream out = new FileOutputStream(tempFile)) {
                IOUtils.copy(data, out);
                LineIterator it = FileUtils.lineIterator(tempFile, "UTF-8");
                try {
                    while (it.hasNext()) {
                        String line = it.nextLine();
                        lines.add(line);
                        sb.append(line);
                    }
                } finally {
                    LineIterator.closeQuietly(it);
                }
            }
            data.close();
            diff.setLineAsString(sb.toString());
        }
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    //System.out.println(lines);
    return lines;
}

public InputStream sendGetInputStream() throws IOException {
    String encoding = Base64.getEncoder().encodeToString(("abc:$xyz$").getBytes("UTF-8"));
    URL obj = new URL(getGetUrl());
    // Setup the connection
    HttpURLConnection con = (HttpURLConnection) obj.openConnection();
    // Set the parameters from the headers
    con.setRequestMethod("GET");
    con.setDoOutput(true);
    con.setRequestProperty("Authorization", "Basic " + encoding);
    InputStream is;
    int responseCode = con.getResponseCode();
    logger.info("GET Response Code :: " + responseCode);
    if (responseCode == HttpURLConnection.HTTP_OK) {
        is = con.getInputStream();
    } else {
        is = null;
    }
    return is;
}
Am I doing something that is consuming a lot of heap? Is there a better way to do this?
Your code has multiple issues. I am not going to solve each and every one, but I will point them out so that you can review your code and learn to write better code.
In the method readFromHttp(..):
- There is no need to copy the stream into a temporary file with IOUtils.copy(data, out).
- There is no need for the StringBuilder sb = new StringBuilder().
- There is no need for the LineIterator.
There are multiple other memory-related issues, but for the time being correct these points and test with the code below. After removing the mistakes above, reading the lines becomes very simple:
try (BufferedReader reader = new BufferedReader(new InputStreamReader(data, StandardCharsets.UTF_8))) {
    for (String line; (line = reader.readLine()) != null;) {
        lines.add(line);
    }
}
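Putting that advice together, a minimal sketch of the whole download path might look like the following (using the standard java.net.HttpURLConnection rather than the custom HttpUrlConnection wrapper from the question, and a hypothetical readLines helper). Bear in mind that even with the temp file, StringBuilder, and LineIterator removed, holding a 110 MB body as a List<String> still costs several times that in heap, so process each line as it is read if the caller allows it.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public final class LineDownloader {
    /** Reads all lines from the given URL, streaming straight from the socket. */
    public static List<String> readLines(String url) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        con.setRequestMethod("GET");
        List<String> lines = new ArrayList<>();
        try (InputStream in = con.getInputStream();
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(in, StandardCharsets.UTF_8))) {
            for (String line; (line = reader.readLine()) != null; ) {
                lines.add(line); // the only copy kept in memory
            }
        } finally {
            con.disconnect();
        }
        return lines;
    }
}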

Java: how to force a refresh on a URL connection

I am behind a proxy server and I am using the following code to get data from a URL:
private static String getData(String address) throws Exception
{
    System.setProperty("java.net.useSystemProxies", "true");
    Date d = new Date();
    String finalAdress = address + "?x=" + d.getTime();
    URL url = new URL(finalAdress);
    System.out.println(finalAdress);
    InputStream html = null;
    HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
    con.setUseCaches(false);
    con.setDefaultUseCaches(false);
    con.setRequestMethod("GET");
    con.addRequestProperty("Cache-Control", "no-cache");
    //con.addRequestProperty("Cache-Control", "max-age=0");
    con.addRequestProperty("Expires", "0");
    con.addRequestProperty("Pragma", "no-cache");
    html = con.getInputStream();
    //html = url.openStream();
    int c = 0;
    StringBuffer buffer = new StringBuffer("");
    while (c != -1) {
        c = html.read();
        buffer.append((char) c);
    }
    return buffer.toString();
}
However, when the data changes on the server side, I still get the same old (cached) data as a response for some time.
I have tried:
- Cache-Control headers
- slightly modified URLs: address + "?x=" + d.getTime()
but nothing seems to work.
Is there a way to force a refresh, as I would in a web browser (Ctrl-F5)?

Java, Selenium: Verifying URLs using HTTP GET - how long should it take?

I have a test suite made up of 10 test cases. Each test case navigates to a section of the website and checks all the URLs in that section. Each section has roughly 10 webpages, and each webpage has over 100 href elements.
The total execution time is between 50 minutes and 1 hour. How long should this take? An hour seems a bit excessive.
Helper method:
public Map<Boolean, List<String>> getHrefResponseBoolean() {
    return driver.findElements(By.xpath("//*[@href]"))
            .stream()
            .filter(s -> !s.getAttribute("href").endsWith("svg") && !s.getAttribute("href").endsWith("webmanifest") && !s.getAttribute("href").endsWith("ico?v=2"))
            .map(ele -> ele.getAttribute("href"))
            .map(String::trim)
            .distinct()
            .collect(Collectors.partitioningBy(link ->
                    HttpUtility.getResponseCode(link) == 200));
}
HttpUtility:
/**
 * Hits the given url and returns the HTTP response code
 * @param link
 * @return
 */
public static int getResponseCode(String link) {
    URL url;
    HttpURLConnection con = null;
    Integer responsecode = 0;
    try {
        url = new URL(link);
        con = (HttpURLConnection) url.openConnection();
        responsecode = con.getResponseCode();
    } catch (Exception e) {
        // skip
    } finally {
        if (null != con)
            con.disconnect();
    }
    return responsecode;
}
First of all, you should set a timeout on the connection. This will prevent the connection from getting stuck.
public static int getResponseCode(String link) {
    URL url;
    HttpURLConnection con = null;
    Integer responsecode = 0;
    try {
        url = new URL(link);
        con = (HttpURLConnection) url.openConnection();
        con.setConnectTimeout(1000); // set the timeout
        responsecode = con.getResponseCode();
    } catch (Exception e) {
        // skip
    } finally {
        if (null != con)
            con.disconnect();
    }
    return responsecode;
}
Now, I believe you should filter the WebElements in the XPath itself. I know it looks messy, but that is for XPath 1.0 compatibility.
public Map<Boolean, List<String>> getHrefResponseBoolean() {
    WebDriver driver = new FirefoxDriver();
    return driver.findElements(By.xpath("//*[not('svg' = substring(@href, string-length(@href) - string-length('svg') + 1)) " +
            " and " +
            " not('webmanifest' = substring(@href, string-length(@href) - string-length('webmanifest') + 1)) " +
            " and " +
            " not('ico?v=2' = substring(@href, string-length(@href) - string-length('ico?v=2') + 1)) " +
            "]"))
            .stream()
            .map(ele -> ele.getAttribute("href").trim())
            .distinct()
            .collect(Collectors.partitioningBy(link -> getResponseCode(link) == 200));
}
Then we can simply access the map to list all the good and bad URLs, as shown here:
map.get(true)  // will contain all the good urls
map.get(false) // will contain all the bad urls
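One more point, beyond either answer above: most of the 50-60 minutes is likely spent in thousands of sequential HTTP round trips. Assuming the link checks are independent of each other (an assumption, not something stated in the question), they can be run in parallel, and a HEAD request avoids downloading response bodies. A rough sketch (some servers reject HEAD with a 405, in which case fall back to GET):

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public final class LinkChecker {
    // Same logic as getResponseCode above, with timeouts and a HEAD request.
    static int responseCode(String link) {
        HttpURLConnection con = null;
        try {
            con = (HttpURLConnection) new URL(link).openConnection();
            con.setConnectTimeout(1000);
            con.setReadTimeout(1000);
            con.setRequestMethod("HEAD"); // headers only; skips the body
            return con.getResponseCode();
        } catch (Exception e) {
            return 0;
        } finally {
            if (con != null) con.disconnect();
        }
    }

    /** Partitions links into good (HTTP 200) and bad, checking them in parallel. */
    public static Map<Boolean, List<String>> check(List<String> links) {
        return links.parallelStream()
                .collect(Collectors.partitioningBy(l -> responseCode(l) == 200));
    }
}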

I can't insert data from a Java program into a Laravel app via an API

I am new to Laravel and I would like to save data to my online server through a Laravel API from a Java program, but I am getting errors.
This is my route in api.php:
Route::middleware('auth:api')->get('/user', function (Request $request) {
    return $request->user();
});
Route::post('hooks', 'ApiTestController@store');
My ApiTestController: it just handles the POST request and then saves to the table.
public function store(Request $request)
{
    $postdata = json_decode($request->input('post_data'), true);
    $datas = $postdata['header'];
    $data = $datas[0];
    $testH = new TestH();
    $testH->test_date = $data['test_date'];
    $testH->expiration = $data['test_date'];
    $testH->source = $data['source'];
    $testH->save();
    return $testH;
}
And my Java code:
try {
    // local development server url
    URL url = new URL("http://127.0.0.1:8000/api/hooks");
    URLConnection con = url.openConnection();
    // activate the output
    con.setDoOutput(true);
    PrintStream ps = new PrintStream(con.getOutputStream());
    // create the JSON string
    String json = null;
    StringWriter sw = new StringWriter();
    JSONWriter wr = new JSONWriter(sw);
    try {
        wr.object().key("header").array();
        wr.object();
        wr.key("test_date").value(new Date());
        wr.key("source").value("TEST");
        wr.key("expiration").value(new Date());
        wr.endObject();
        wr.endArray().endObject();
        json = sw.toString();
        System.out.println(json);
    } catch (JSONException ex) {
        Logger.getLogger(WebConnectSample.class.getName()).log(Level.SEVERE, null, ex);
    }
    // send to laravel server
    ps.print("post_data=" + json);
    HttpURLConnection httpConn = (HttpURLConnection) con;
    InputStream is;
    if (httpConn.getResponseCode() >= 419) {
        is = httpConn.getErrorStream();
    } else {
        is = httpConn.getInputStream();
    }
    // read the server reply (note: reading from is, so error replies are shown too)
    BufferedReader in = new BufferedReader(new InputStreamReader(is));
    String line = null;
    while ((line = in.readLine()) != null) {
        System.out.println(line);
    }
    // close the print stream
    ps.close();
} catch (Exception e) {
    e.printStackTrace();
}
The thing is, when I don't save via $testH->save(), everything works fine. But if I include it, Java returns the following error:
Type error: Argument 1 passed to Illuminate\Routing\Middleware\ThrottleRequests::addHeaders() must be an instance of Symfony\Component\HttpFoundation\Response, string given, called in C:\Users\relixusdev\Documents\WebProjects\tcmsite\vendor\laravel\framework\src\Illuminate\Routing\Middleware\ThrottleRequests.php on line 61
Any idea what part causes the error? Does it have to do with authentication? I just want to be able to save to the online database from my Java program.
Try using a Route group with a prefix, as below:
Route::group(['prefix' => 'api'], function() {
    Route::post('hooks', 'ApiTestController@store');
});
If anyone comes here with the same problem: I found out that the issue was that I didn't have the created_at and updated_at columns in my table. I didn't realize they are required by Laravel's defaults. Silly me.
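As an aside on the Java side of this thread: the snippet above writes the raw JSON into the form body without URL-encoding it, so any & or + inside the payload would corrupt the post_data parameter. A minimal sketch of a safer write, assuming the same local endpoint and a stand-in payload:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public final class LaravelPostDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"header\":[{\"source\":\"TEST\"}]}"; // stand-in payload
        // URL-encode the value so special characters survive the form encoding.
        String body = "post_data=" + URLEncoder.encode(json, "UTF-8");

        HttpURLConnection con = (HttpURLConnection)
                new URL("http://127.0.0.1:8000/api/hooks").openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = con.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + con.getResponseCode());
    }
}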

Issue with retrieving JSON from a webpage on Android

So I'm facing some difficulty trying to do what seems simple: obtain a JSON file from a webpage and then parse it on Android. I have already built the parser and tested it in Eclipse (in fact, all of the code works in Eclipse). However, when I run the HttpURLConnection and try to retrieve the JSON data in a string in Android Studio, I end up getting no exceptions and an almost empty string (I think I am getting the 1st, 2nd, 3rd, and last characters, but I'm not too sure). I have included parts of the code below.
URL url = null;
HttpURLConnection urc = null;
try {
    url = new URL(query);
    urc = (HttpURLConnection) url.openConnection();
    InputStream in = new BufferedInputStream(urc.getInputStream());
    jsoncontent = readStream(in);
    System.out.println(jsoncontent);
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    urc.disconnect();
}
The code for readStream() is below:
private static String readStream(InputStream is) throws IOException {
    StringBuilder sb = new StringBuilder();
    BufferedReader r = new BufferedReader(new InputStreamReader(is), 1000);
    for (String line = r.readLine(); line != null; line = r.readLine()) {
        sb.append(line);
    }
    is.close();
    return sb.toString();
}
Here is an exact chunk from an assignment I did last semester:
URL u = new URL(url);
HttpURLConnection conn = (HttpURLConnection) u.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "text/html");
BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
JSONObject searchResults = new JSONObject(in.readLine());
...
conn.disconnect();
You seem to be missing setRequestMethod("GET") and setRequestProperty("Accept", "text/html") in your code. Hope this helps.
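Combining that suggestion with the readStream() logic from the question, a minimal self-contained sketch might look like this (the Accept value and the fetch name are just illustrative; an explicit charset also avoids platform-default decoding surprises):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public final class JsonFetchDemo {
    public static String fetch(String query) throws Exception {
        HttpURLConnection urc = (HttpURLConnection) new URL(query).openConnection();
        urc.setRequestMethod("GET");
        urc.setRequestProperty("Accept", "application/json"); // or "text/html" as in the answer
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(urc.getInputStream(), StandardCharsets.UTF_8))) {
            for (String line; (line = r.readLine()) != null; ) {
                sb.append(line);
            }
        } finally {
            urc.disconnect();
        }
        return sb.toString();
    }
}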
