java.net.URL giving security vulnerability during code scan - java

I have the method below to check whether an incoming URL is valid, and based on the result I need to redirect to an error page.
/*
 * This method is used to validate the URL and return a response code.
 */
private static int isValidUrl(String qrUrl) {
    int code = 0;
    try {
        URL url = new URL(qrUrl);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.connect();
        code = connection.getResponseCode();
        logger.info("Response code is {} for the url", code);
    } catch (MalformedURLException e) {
        logger.error("A MalformedURLException occurred in the QR Code servlet during URL validation", e);
    } catch (IOException e) {
        logger.error("An IOException occurred in the QR Code servlet during URL validation", e);
    }
    return code;
}
This code is flagged as a medium security vulnerability during the code scan (description: "This web server request could be used by an attacker to expose internal services and filesystem"). How can I fix it?
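The finding is a server-side request forgery (SSRF) warning: qrUrl comes from outside, so an attacker could point it at internal hosts. One common mitigation is to validate the parsed URL against an allowlist of schemes and hosts before opening any connection. Below is a minimal sketch of that idea; the ALLOWED_HOSTS set and the https-only rule are assumptions you would adapt to the hosts your application legitimately needs to reach.

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class UrlValidator {

    // Hypothetical allowlist: replace with the hosts the application is actually allowed to call.
    private static final Set<String> ALLOWED_HOSTS =
            new HashSet<>(Arrays.asList("example.com", "www.example.com"));

    static int isValidUrl(String qrUrl) {
        int code = 0;
        try {
            URL url = new URL(qrUrl);
            // Reject anything that is not HTTPS to an allowlisted host before connecting.
            if (!"https".equalsIgnoreCase(url.getProtocol()) || !ALLOWED_HOSTS.contains(url.getHost())) {
                return code; // caller treats 0 as invalid and redirects to the error page
            }
            HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            connection.setRequestMethod("GET");
            connection.setConnectTimeout(5000);
            connection.setReadTimeout(5000);
            connection.connect();
            code = connection.getResponseCode();
            connection.disconnect();
        } catch (MalformedURLException e) {
            // malformed input: fall through and return 0
        } catch (IOException e) {
            // network failure: fall through and return 0
        }
        return code;
    }
}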

Related

HttpURLConnection not working for multilingual site

In my project I am verifying the links for a few landing pages from an Excel sheet using Selenium and Java. Below is the code I am using:
public static void verifyConnection(String linkUrl) throws IOException {
    URL url = new URL(linkUrl);
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    try {
        if (urlConnection.getResponseCode() == 200) {
            System.out.println("resp 200");
        } else {
            System.out.println("resp error");
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        urlConnection.disconnect();
    }
}
The problem is that when linkUrl is something like https://www.google.com/ the check works fine, but when I do the same for the Spanish version of our site, with links like https://es.google.com/ or https://espanol.google.com/, I get a Connection timed out exception.
Could someone advise what I am missing?
Note: the above Google links may not be valid; they are just to explain my scenario.
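A common first step when only some hosts time out is to set explicit connect and read timeouts and a browser-like User-Agent, and to log the actual response code, so a slow or blocked host is easier to tell apart from a broken URL. The sketch below keeps the same verifyConnection signature and is only a diagnostic aid; a timeout on specific domains usually points at DNS, proxy, or firewall configuration rather than the Java code itself.

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class LinkChecker {

    public static void verifyConnection(String linkUrl) throws IOException {
        URL url = new URL(linkUrl);
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        try {
            // Fail fast instead of hanging for the platform's default timeout.
            urlConnection.setConnectTimeout(10000);
            urlConnection.setReadTimeout(10000);
            // Some sites answer differently (or not at all) to clients with no User-Agent.
            urlConnection.setRequestProperty("User-Agent", "Mozilla/5.0");
            urlConnection.setInstanceFollowRedirects(true);

            int code = urlConnection.getResponseCode();
            System.out.println(linkUrl + " -> " + code);
        } finally {
            urlConnection.disconnect();
        }
    }
}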

DELETE Request in Android doesn't connect to server

My question is how I can send a DELETE request to a URL in Android Studio (Java). I already have an AsyncTask that GETs JSON from a URL; now I need to do the same with a DELETE request.
EDIT:
Right now I have this code:
int pos = arrlist.get(info.position).getId();
URL_DELETE = "http://testserver/test/tesst.php?id=" + pos + "&username=" + username + "&password=" + password;
URL url = null;
try {
    url = new URL(URL_DELETE);
    HttpURLConnection httpCon = (HttpURLConnection) url.openConnection();
    httpCon.setDoOutput(true);
    httpCon.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    httpCon.setRequestMethod("DELETE");
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (ProtocolException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
To be clear, the content at the given URL should be deleted. But when I run the code, nothing happens.
You need to call connect() on the HttpURLConnection. Right now you're not actually making a connection to the server.
Based on your comments on the other answer, you're also trying to run this code on the main (UI) thread - you'll need to change your code to run on a background thread.
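As a rough illustration of both points, the sketch below runs the DELETE on a plain background thread and triggers the request by reading the response code (which connects implicitly). It assumes the usual java.net imports plus android.util.Log, and that the delete URL is built as in the question; it could be called as sendDelete(URL_DELETE) right after that string is assembled.

private void sendDelete(final String deleteUrl) {
    // Network I/O must not run on the main thread, so use a background thread.
    new Thread(new Runnable() {
        @Override
        public void run() {
            HttpURLConnection httpCon = null;
            try {
                URL url = new URL(deleteUrl);
                httpCon = (HttpURLConnection) url.openConnection();
                httpCon.setRequestMethod("DELETE");
                // Reading the response code sends the request; connect() happens implicitly.
                int responseCode = httpCon.getResponseCode();
                Log.d("DeleteRequest", "Server answered with " + responseCode);
            } catch (IOException e) {
                Log.e("DeleteRequest", "DELETE failed", e);
            } finally {
                if (httpCon != null) {
                    httpCon.disconnect();
                }
            }
        }
    }).start();
}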
If you're using OkHttp:
Request request = new Request.Builder().delete().url(url).build();
try {
    Response rawResponse = new OkHttpClient().newCall(request).execute();
    String responseAsString = rawResponse.body().string();
} catch (IOException e) {
    System.err.println(e.getMessage());
}

How to call the GitHub API from HttpsURLConnection in Java

I need to make my branch named testingProtectedBranch1 a protected branch, providing the following parameters:
required_status_checks: include_admins=true, strict=true, contexts=continuous-integration/travis-ci
restrictions: null
required_pull_request_reviews: include_admins=false
Here is my code; the access token (the variable token) is provided by the user at runtime.
public void setMasterBranchAsProtected() throws Exception {
    String URLForCallingTheBranchAPI = "https://api.github.com/repos/kasunsiyambalapitiya/testingProtectedBranch1/branches/master/protection";
    String jsonInput = "{\"required_status_checks\":{\"include_admins\":true,\"strict\":true,\"contexts\":[\"continuous-integration/travis-ci\"]},"
            + "\"restrictions\":null,"
            + "\"required_pull_request_reviews\":{\"include_admins\":false} ";
    try {
        URL urlObject = new URL(URLForCallingTheBranchAPI);
        HttpsURLConnection httpsURLCon = (HttpsURLConnection) urlObject.openConnection();
        httpsURLCon.setDoOutput(true);
        httpsURLCon.setRequestMethod("PUT");
        httpsURLCon.setRequestProperty("User-Agent", "Mozilla/5.0");
        httpsURLCon.setRequestProperty("Accept", "application/vnd.github.loki-preview+json");
        httpsURLCon.setRequestProperty("Authorization", "Bearer " + token);
        OutputStreamWriter outputStream = new OutputStreamWriter(httpsURLCon.getOutputStream());
        outputStream.write(jsonInput);
        int responseCode = httpsURLCon.getResponseCode();
        outputStream.flush();
        outputStream.close();
    } catch (MalformedURLException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
For the response code I receive 422, which corresponds to Unprocessable Entity. What am I doing wrong here? Please help me figure this out. Thanks in advance.
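One detail worth checking, though it is only a guess at the cause of the 422, is that the jsonInput string above never closes the outer JSON object, so the server receives a truncated body. A balanced version of the payload would look roughly like this (field names copied from the question; verify them against GitHub's branch protection documentation for your API version):

String jsonInput = "{"
        + "\"required_status_checks\":{\"include_admins\":true,\"strict\":true,"
        + "\"contexts\":[\"continuous-integration/travis-ci\"]},"
        + "\"restrictions\":null,"
        + "\"required_pull_request_reviews\":{\"include_admins\":false}"
        + "}";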

What is required for a successful PHP session?

I'm writing a client J2ME app that connects to a PHP based API using a POST method. For security purposes the app and the API should interact within a session with a 30 minute timeout. My problem is that the application user has to keep logging in even though the session timeout has not yet elapsed. My PHP code is fine: I've tested it in a browser and it works; the application, however, fails and I have to keep logging in. Might I be missing something? These are the headers I am sending to the server from my Java app.
String phonenumber = this.phonenumberTextbox.getString();
String password = this.passwordTextBox.getString();
HttpConnection httpConn = null;
String url = "http://192.168.56.1/?action=login-user";
InputStream is = null;
OutputStream os = null;
String sReply = "";
boolean loggedIn = false;
try {
    // Open an HTTP Connection object
    httpConn = (HttpConnection) Connector.open(url);
    // Setup HTTP Request to POST
    httpConn.setRequestMethod(HttpConnection.POST);
    httpConn.setRequestProperty("User-Agent", "Profile/MIDP-1.0 Confirguration/CLDC-1.0");
    httpConn.setRequestProperty("Accept_Language", "en-US");
    // Content-Type is a must to pass parameters in a POST request
    httpConn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    os = httpConn.openOutputStream();
    String params = "phonenumber=" + phonenumber + "&password=" + password;
    os.write(params.getBytes());
    // Read the response from the server
    StringBuffer sb = new StringBuffer();
    is = httpConn.openDataInputStream();
    int chr;
    while ((chr = is.read()) != -1) {
        sb.append((char) chr);
    }
    // Web server just returns stuff
    sReply = sb.toString();
} catch (IOException ex) {
    System.out.println("Cannot find server");
} finally {
    if (is != null) {
        try {
            is.close();
        } catch (IOException ex) {
        }
    }
    if (os != null) {
        try {
            os.close();
        } catch (IOException ex) {
        }
    }
    if (httpConn != null) {
        try {
            httpConn.close();
        } catch (IOException ex) {
        }
    }
}
// do something with sReply
Default practice in PHP these days** when dealing with sessions is for PHP to generate a cookie for the client to store. Your Java client must store this cookie (called PHPSESSID by default, but any PHP programmer worth their salt will change the cookie name) and present it in the HTTP headers on subsequent requests.
I'm not very familiar with the HTTP support in Java, but if it's anything like curl it will include options for setting and retrieving cookies.
** It used to be possible to pass session tokens by a query string (a GET parameter) when making the request, but as this was horribly insecure and leaked potentially sensitive information, it's never used any more and may even no longer be available as an option.
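Applied to the J2ME code above, the cookie round trip might look roughly like the sketch below. The sessionCookie field is a hypothetical place to keep the value between requests, and the header parsing is deliberately simplified to the NAME=VALUE part.

// After the login request: read the Set-Cookie header and remember it.
String setCookie = httpConn.getHeaderField("Set-Cookie");
if (setCookie != null) {
    // Keep only "NAME=VALUE", dropping attributes such as path or expires.
    int semicolon = setCookie.indexOf(';');
    sessionCookie = (semicolon > -1) ? setCookie.substring(0, semicolon) : setCookie;
}

// On every later request: send the cookie back before opening the output stream.
if (sessionCookie != null) {
    httpConn.setRequestProperty("Cookie", sessionCookie);
}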

HTTP response 403 when program tries to initiate connection to Google?

I have written a test web crawler class that attempts to search Google, as shown:
public class WebCrawler {
    String query;

    public WebCrawler(String search) {
        query = search;
    }

    public void connect() {
        HttpURLConnection connection = null;
        try {
            String url = "http://www.google.com/search?q=" + query;
            URL search = new URL(url);
            connection = (HttpURLConnection) search.openConnection();
            connection.setRequestMethod("GET");
            connection.setDoOutput(true);
            connection.setDoInput(true);
            connection.setUseCaches(false);
            connection.setAllowUserInteraction(false);
            connection.connect();
            BufferedReader read = new BufferedReader(new InputStreamReader(connection.getInputStream()));
            String line = null;
            while ((line = read.readLine()) != null) {
                System.out.println(line);
            }
            read.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (ProtocolException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            connection.disconnect();
        }
    }
}
When I try to run it with a test query "test", though, I get an HTTP 403 response. What am I missing? This is my first time doing any networking with Java.
403 == Forbidden, which makes sense because you're a robot trying to access a part of Google that they don't want robots accessing. Google's robots.txt pretty clearly specifies that you shouldn't be scraping /search.
Google provides a search API which allows 100 queries per day. They provide libraries and examples of how to interface with it in most languages, including Java. Beyond that, you've gotta pay.
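If you go the API route, a minimal sketch using the Custom Search JSON API could reuse HttpURLConnection as below. The apiKey and engineId values are hypothetical placeholders for credentials you obtain from the Google developer console, and the response is raw JSON you would still need to parse.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class ApiSearch {

    public static void main(String[] args) throws IOException {
        // Hypothetical credentials: obtain a real key and engine ID from the Google API console.
        String apiKey = "YOUR_API_KEY";
        String engineId = "YOUR_SEARCH_ENGINE_ID";
        String query = URLEncoder.encode("test", "UTF-8");

        URL url = new URL("https://www.googleapis.com/customsearch/v1?key=" + apiKey
                + "&cx=" + engineId + "&q=" + query);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // JSON results; parse with a JSON library of your choice
            }
        } finally {
            connection.disconnect();
        }
    }
}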
