In my project I am verifying the links for a few landing pages from an Excel sheet using Selenium and Java. Below is the code I am using:
public static void verifyConnection(String linkUrl) throws IOException {
    URL url = new URL(linkUrl);
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    try {
        if (urlConnection.getResponseCode() == 200) {
            System.out.println("resp 200");
        } else {
            System.out.println("resp error");
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        urlConnection.disconnect();
    }
}
The problem is that when linkUrl is something like https://www.google.com/ the response comes back fine, but when I do the same for the Spanish version of our site, with links like https://es.google.com/ or https://espanol.google.com/, I get a Connection timed out exception.
Could someone advise what I am missing?
Note: the above Google links may not be valid; they are just there to explain my scenario.
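One thing worth checking (a sketch under assumptions, not a confirmed fix for this exact site): by default HttpURLConnection has no connect or read timeout and sends a Java-branded User-Agent, which some servers or regional redirects handle poorly. Setting explicit timeouts and a browser-like User-Agent at least makes the failure mode visible quickly; the class name, timeout values, and User-Agent string below are illustrative.

import java.net.HttpURLConnection;
import java.net.URL;

public class LinkCheck {
    // Sketch: same check as verifyConnection, but with explicit timeouts
    // and a User-Agent header. Values are illustrative assumptions.
    public static void verifyConnectionWithTimeouts(String linkUrl) {
        try {
            URL url = new URL(linkUrl);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(10000); // fail fast instead of hanging
            conn.setReadTimeout(10000);
            // Some servers reject or stall on the default "Java/..." User-Agent.
            conn.setRequestProperty("User-Agent", "Mozilla/5.0 (compatible; LinkChecker)");
            System.out.println(linkUrl + " -> " + conn.getResponseCode());
            conn.disconnect();
        } catch (Exception e) {
            System.out.println(linkUrl + " -> failed: " + e.getMessage());
        }
    }
}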
I have the method below to check whether an incoming URL is valid or not; based on the result I need to redirect to an error page.
/*
 * This method is used to validate the URL and return a response code
 */
private static int isValidUrl(String qrUrl) {
    int code = 0;
    try {
        URL url = new URL(qrUrl);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.connect();
        code = connection.getResponseCode();
        logger.info("Response code is {} for the url", code);
    } catch (MalformedURLException e) {
        logger.error("A MalformedURLException occurred in QR Code servlet during URL validation", e);
    } catch (IOException e) {
        logger.error("An IOException occurred in QR Code servlet during URL validation", e);
    }
    return code;
}
The connection code above is flagged as a medium-severity security vulnerability during code scanning (description: "This web server request could be used by an attacker to expose internal services and filesystem"). How can I fix it?
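For context, that warning is about server-side request forgery (SSRF): the method opens a connection to whatever URL the request hands it, so an attacker could point it at internal hosts. The usual fix is to validate the scheme and host against an allowlist before connecting. A minimal sketch, assuming a hypothetical allowlist (the host names are placeholders):

import java.net.URL;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class UrlAllowlist {
    // Hypothetical allowlist; replace with the hosts your application actually needs to reach.
    private static final Set<String> ALLOWED_HOSTS =
            new HashSet<>(Arrays.asList("www.example.com", "example.com"));

    // Returns true only for http(s) URLs whose host is on the allowlist.
    public static boolean isAllowed(String qrUrl) {
        try {
            URL url = new URL(qrUrl);
            String protocol = url.getProtocol();
            if (!"http".equals(protocol) && !"https".equals(protocol)) {
                return false; // reject file://, ftp://, and other schemes outright
            }
            return ALLOWED_HOSTS.contains(url.getHost().toLowerCase());
        } catch (Exception e) {
            return false; // malformed URL: treat as invalid
        }
    }
}

Calling isAllowed(qrUrl) before opening any connection keeps the request contained to known hosts, which is what the scanner is asking for.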
I have been trying for several days to get an HTTPS connection working against the Yobit public API, and I don't know what is wrong with my code. I have tried many different examples, but nothing works with Yobit: they either return a 411 or 503 error, or throw MalformedURLException: no protocol. Can anyone help? I have very limited experience with HTTPS and web programming in Java, so I would really appreciate any solutions or references.
public void buildHttpsConnection() {
    try {
        URL url = new URL("https://yobit.net/api/3/info");
        HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        con.setRequestProperty("User-Agent", "Mozilla/5.0 (compatible; JAVA AWT)");
        con.setRequestProperty("Accept-Language", "en-US,en;q=0.5");
        con.setDoOutput(true);
        con.setUseCaches(false);
        System.out.println(con.getResponseCode());
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Try using the URL "https://www.yobit.net/api/3/info" instead of "https://yobit.net/api/3/info".
It will give you the same result; you can verify that from a browser window.
Check the snippet below.
try {
    URL url = new URL("https://www.yobit.net/api/3/info");
    HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
    con.setRequestMethod("GET");
    con.setRequestProperty("User-Agent", "Mozilla/5.0 (compatible; JAVA AWT)");
    con.setRequestProperty("Accept-Language", "en-US,en;q=0.5");
    con.setDoOutput(true);
    con.setUseCaches(false);
    con.connect();
    System.out.println(con.getResponseCode());
} catch (Exception e) {
    e.printStackTrace();
}
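A side note on both snippets (an observation, not a verified cause of the 411): setDoOutput(true) is meant for requests that send a body, and on some implementations (Android's HttpURLConnection documents this) it silently switches a GET into a POST, which a server may then reject for having no Content-Length. A plain GET needs neither setDoOutput nor setUseCaches:

public static void minimalGet() throws Exception {
    // Plain GET against the same endpoint; no request body, so no setDoOutput(true).
    URL url = new URL("https://www.yobit.net/api/3/info");
    HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
    con.setRequestMethod("GET");
    con.setRequestProperty("User-Agent", "Mozilla/5.0 (compatible; JAVA AWT)");
    System.out.println(con.getResponseCode()); // expect 200 with a JSON body
    con.disconnect();
}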
I'm building a project in which I want a method that makes a simple HTTP GET request in order to send two variables to a website via the URL.
In a normal Java project I would likely use java.net or Apache HttpClient and solve the issue in a matter of minutes. In Java ME, due to my lack of experience, I'm not really able to accomplish the task.
Basically, given a URL like google.com/index.php?v1=x&v=y, I want to make a GET request that sends those variables via the URL.
Any tips?
Here's an example of how you could do something like that.
HttpConnection connection = null;
InputStream inputstream = null;
String url = "http://www.google.com/index.php?v1=x&v=y";
StringBuffer dataReceived = new StringBuffer();

try {
    connection = (HttpConnection) Connector.open(url);
    connection.setRequestMethod(HttpConnection.GET);
    connection.setRequestProperty("Content-Type", "text/plain");
    connection.setRequestProperty("Connection", "close");

    if (connection.getResponseCode() == HttpConnection.HTTP_OK) {
        inputstream = connection.openInputStream();
        int ch;
        while ((ch = inputstream.read()) != -1) {
            dataReceived.append((char) ch);
        }
    } else {
        // Connection not ok
    }
} catch (Exception e) {
    // Something went wrong
} finally {
    if (inputstream != null) {
        try {
            inputstream.close();
        } catch (Exception e) {
        }
    }
    if (connection != null) {
        try {
            connection.close();
        } catch (Exception e) {
        }
    }
}
Note: I didn't test this specific code. I just adapted some code I had lying around from a previous project, so you may need to fix a few errors.
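One thing to keep in mind (an addition, not part of the snippet above): if the variable values can contain spaces or reserved characters, they need to be percent-encoded before being appended to the URL, and Java ME has no java.net.URLEncoder. A minimal, ASCII-only helper sketch:

// Illustrative percent-encoder for ASCII query values; not a full RFC 3986 implementation.
public static String encodeValue(String s) {
    StringBuffer out = new StringBuffer();
    for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        if ((c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z')
                || (c >= '0' && c <= '9') || c == '-' || c == '_' || c == '.') {
            out.append(c);
        } else {
            out.append('%');
            String hex = Integer.toHexString(c).toUpperCase();
            if (hex.length() == 1) {
                out.append('0'); // pad single-digit hex values
            }
            out.append(hex);
        }
    }
    return out.toString();
}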
I have written a test web crawler class that attempts to search Google, as shown:
public class WebCrawler {
    String query;

    public WebCrawler(String search) {
        query = search;
    }

    public void connect() {
        HttpURLConnection connection = null;
        try {
            // Encode the query so spaces and special characters survive the trip.
            String url = "http://www.google.com/search?q=" + URLEncoder.encode(query, "UTF-8");
            URL search = new URL(url);
            connection = (HttpURLConnection) search.openConnection();
            connection.setRequestMethod("GET");
            connection.setDoOutput(true);
            connection.setDoInput(true);
            connection.setUseCaches(false);
            connection.setAllowUserInteraction(false);
            connection.connect();
            BufferedReader read = new BufferedReader(new InputStreamReader(connection.getInputStream()));
            String line = null;
            while ((line = read.readLine()) != null) {
                System.out.println(line);
            }
            read.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (ProtocolException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (connection != null) { // guard against NPE if openConnection failed
                connection.disconnect();
            }
        }
    }
}
When I try to run it with the test query "test", though, I get an HTTP 403 error. What am I missing? This is my first time doing any networking with Java.
403 == Forbidden, which makes sense: you're a robot trying to access a part of Google that they don't want robots accessing. Google's robots.txt pretty clearly specifies that you shouldn't be scraping /search.
Google provides a Custom Search API that allows 100 queries per day for free, with libraries and examples of how to interface with it in most languages, including Java. Beyond that, you have to pay.
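As a rough illustration of that route (the key and search engine ID are placeholders you get from the Google Cloud console; treat the response handling as a sketch):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class CustomSearchExample {
    // Placeholders: obtain a real key and search engine ID from the Google Cloud console.
    private static final String API_KEY = "YOUR_API_KEY";
    private static final String ENGINE_ID = "YOUR_SEARCH_ENGINE_ID";

    public static void search(String query) throws Exception {
        String endpoint = "https://www.googleapis.com/customsearch/v1"
                + "?key=" + API_KEY
                + "&cx=" + ENGINE_ID
                + "&q=" + URLEncoder.encode(query, "UTF-8");
        HttpURLConnection connection = (HttpURLConnection) new URL(endpoint).openConnection();
        connection.setRequestMethod("GET");
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), "UTF-8"));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // raw JSON results
        }
        reader.close();
        connection.disconnect();
    }
}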
See, I have to check 50+ URLs for validity, and I'm assuming that catching 50+ exceptions is kind of over the top. Is there a way to check whether a bunch of URLs are valid without wrapping everything in try/catch blocks? Also, just FYI: in Android the class UrlValidator doesn't exist (it does exist in standard Java), and there's URLUtil.isValidUrl(String url), but that method seems to be pleased with whatever you throw at it as long as it contains http://. Any suggestions?
This solution does catch exceptions; however, others may find it useful, and it doesn't require any libraries.
public boolean URLIsReachable(String urlString) {
    try {
        URL url = new URL(urlString);
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        int responseCode = urlConnection.getResponseCode();
        urlConnection.disconnect();
        return responseCode == 200; // HTTP 200 OK means the URL is reachable
    } catch (MalformedURLException e) {
        e.printStackTrace();
        return false;
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
}
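For the 50+ URL case, a method like this can simply be called in a loop; switching to a HEAD request is a cheap variant since the response body is never downloaded, and timeouts keep dead hosts from stalling the run. A sketch, with an illustrative URL list:

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Arrays;
import java.util.List;

public class BatchUrlCheck {
    // Same idea as URLIsReachable, but using HEAD plus timeouts.
    public static boolean isReachable(String urlString) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
            conn.setRequestMethod("HEAD");
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            int code = conn.getResponseCode();
            conn.disconnect();
            return code == 200;
        } catch (Exception e) {
            return false; // malformed or unreachable: either way, not usable
        }
    }

    public static void main(String[] args) {
        // Illustrative list; in practice these would be the 50+ URLs to verify.
        List<String> urls = Arrays.asList("https://www.google.com/", "http://example.invalid/");
        for (String u : urls) {
            System.out.println(u + " -> " + (isReachable(u) ? "reachable" : "unreachable"));
        }
    }
}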