Authenticating to SharePoint with Kerberos from a Java HttpClient

I have a Linux / Java 6 client that authenticates to SharePoint 2010 with NTLM and then sends HTTP REST web service calls using Apache Commons HttpClient.
I can do this with NTLM, but I want to use the same REST API against a SharePoint 2010 instance that uses Kerberos authentication.
Are there any examples of how to authenticate and send REST calls over HTTP to a Kerberos-enabled SharePoint (preferably using HttpClient)?
P.S. I don't have access to the SharePoint code, but I do have access to the SharePoint admin configuration.
This is roughly how I authenticate with NTLM:
HttpClient httpClient = new HttpClient(new SimpleHttpConnectionManager(true));
AuthPolicy.registerAuthScheme(AuthPolicy.NTLM, JCIFS_NTLMScheme.class);
String localHostName = Inet4Address.getLocalHost().getHostName();
authscope = new AuthScope(uri.getHost(), AuthScope.ANY_PORT);
httpClient.getState().setCredentials(authscope,
        new NTCredentials(getUsername(), getPassword(), localHostName, getDomain()));
// after the initial NTLM auth I can call my REST service with "httpClient.executeMethod"
int status = httpClient.executeMethod(new GetMethod(accessURI + "/sitecollection/info"));

Please confirm that your environment is correctly set up for Kerberos; this can be verified by running kinit. If kinit fails, you will need to ensure that your krb5.ini (Windows) or krb5.conf (Linux) points to your domain controller correctly.
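For reference, a minimal krb5.conf for a single Active Directory realm looks roughly like the sketch below; EXAMPLE.COM and dc1.example.com are placeholders for your own realm and domain controller:

[libdefaults]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        kdc = dc1.example.com
    }

[domain_realm]
    .example.com = EXAMPLE.COM
    example.com = EXAMPLE.COM

If kinit user@EXAMPLE.COM succeeds with this file in place, the Java side should be able to obtain tickets with the same configuration.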
Once you have confirmed that Kerberos is functional, you can use the example code from HttpClient as pasted below.
Please note that there are many issues that can cause Kerberos to fail, such as time synchronisation, supported encryption types and trust relationships across domain forests; it is also worth ensuring that your client is on a separate box from the server.
Here is the example code, which is available in the HttpClient download; you will need to ensure your JAAS configuration and krb5.conf (or .ini) are correct.
import java.security.Principal;

import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.Credentials;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpUriRequest;
import org.apache.http.client.params.AuthPolicy;
import org.apache.http.impl.auth.SPNegoSchemeFactory;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

public class ClientKerberosAuthentication {

    public static void main(String[] args) throws Exception {
        System.setProperty("java.security.auth.login.config", "login.conf");
        System.setProperty("java.security.krb5.conf", "krb5.conf");
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        DefaultHttpClient httpclient = new DefaultHttpClient();
        try {
            httpclient.getAuthSchemes().register(AuthPolicy.SPNEGO, new SPNegoSchemeFactory());

            // Dummy credentials: the actual Kerberos credentials come from the JAAS login
            Credentials use_jaas_creds = new Credentials() {
                public String getPassword() {
                    return null;
                }
                public Principal getUserPrincipal() {
                    return null;
                }
            };
            httpclient.getCredentialsProvider().setCredentials(
                    new AuthScope(null, -1, null),
                    use_jaas_creds);

            HttpUriRequest request = new HttpGet("http://kerberoshost/");
            HttpResponse response = httpclient.execute(request);
            HttpEntity entity = response.getEntity();

            System.out.println("----------------------------------------");
            System.out.println(response.getStatusLine());
            System.out.println("----------------------------------------");
            if (entity != null) {
                System.out.println(EntityUtils.toString(entity));
            }
            System.out.println("----------------------------------------");

            // This ensures the connection gets released back to the manager
            EntityUtils.consume(entity);
        } finally {
            // When the HttpClient instance is no longer needed,
            // shut down the connection manager to ensure
            // immediate deallocation of all system resources
            httpclient.getConnectionManager().shutdown();
        }
    }
}
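For completeness, the login.conf referenced by java.security.auth.login.config needs a JAAS entry for the Kerberos login module. A minimal sketch is shown below; note that the entry name the JGSS layer looks up varies by JRE (Java 6 looks for com.sun.security.jgss.krb5.initiate, older releases use com.sun.security.jgss.initiate), so adjust it if the Kerberos debug output complains about a missing configuration entry:

com.sun.security.jgss.krb5.initiate {
    com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true doNotPrompt=true debug=true;
};

With javax.security.auth.useSubjectCredsOnly set to false (as in the example above) and useTicketCache=true, the ticket obtained by kinit is reused, so no username or password has to be hard-coded on the Java side.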

Related

Adding SSLContext in CloseableHttpAsyncClient at Runtime

We have a generic application which delivers messages to different POST endpoints, and we are using CloseableHttpAsyncClient for this purpose. It is built/initialized as follows:
private static CloseableHttpAsyncClient get() {
    CloseableHttpAsyncClient lInstance;
    IOReactorConfig ioReactorConfig = IOReactorConfig.custom()
            .setIoThreadCount(100)
            .setConnectTimeout(10000)
            .setSoTimeout(10000).build();
    ConnectingIOReactor ioReactor = null;
    try {
        ioReactor = new DefaultConnectingIOReactor(ioReactorConfig);
    } catch (IOReactorException e) {
        logger_.logIfEnabled(Level.ERROR, e);
    }
    PoolingNHttpClientConnectionManager connManager = new PoolingNHttpClientConnectionManager(ioReactor);
    connManager.setDefaultMaxPerRoute(50);
    connManager.setMaxTotal(5000);
    connManager.closeIdleConnections(10000, TimeUnit.MILLISECONDS);
    baseRequestConfig = RequestConfig.custom().setConnectTimeout(10000)
            .setConnectionRequestTimeout(10000)
            .setSocketTimeout(10000).build();
    lInstance = HttpAsyncClients.custom().setDefaultRequestConfig(baseRequestConfig)
            .setConnectionManager(connManager).build();
    lInstance.start();
    return lInstance;
}
This client is prebuilt and initialized. Whenever a new request arrives at our application, a new post request is built based on the message and authentication type: httpPost = new HttpPost(builder.build());
After setting the required headers, payload, etc., the existing httpClient is used to send the request:
httpClient.execute(httpPost, httpContext, null);
Now, the question is based on a new requirement to support client-certificate-based authentication. Since our current approach is to create the httpClient up front, how can we change its behaviour so that it sends a client certificate to the endpoints that need one and works as before for the endpoints that don't?
I know I can supply an SSLContext when creating the CloseableHttpAsyncClient, but at creation time I don't know whether any endpoint will require certificate-based authentication. We can have many endpoints that support client certificates, and they only become known at runtime.
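One possible approach, assuming the client certificate and its private key can be loaded from a keystore when the client is built, is to create an SSLContext with that key material and register it on the connection manager. During the TLS handshake the certificate is only presented when a server actually requests one, so endpoints that do not use client-certificate authentication keep working unchanged. A rough sketch (keystore path, type and password are placeholders, not part of the original code):

import java.io.FileInputStream;
import java.io.InputStream;
import java.security.KeyStore;
import javax.net.ssl.SSLContext;
import org.apache.http.config.Registry;
import org.apache.http.config.RegistryBuilder;
import org.apache.http.conn.ssl.DefaultHostnameVerifier;
import org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager;
import org.apache.http.nio.conn.NoopIOSessionStrategy;
import org.apache.http.nio.conn.SchemeIOSessionStrategy;
import org.apache.http.nio.conn.ssl.SSLIOSessionStrategy;
import org.apache.http.nio.reactor.ConnectingIOReactor;
import org.apache.http.ssl.SSLContexts;

private static PoolingNHttpClientConnectionManager createConnectionManager(ConnectingIOReactor ioReactor) throws Exception {
    // Load the client certificate and private key; path and password are placeholders.
    KeyStore keyStore = KeyStore.getInstance("PKCS12");
    try (InputStream in = new FileInputStream("/path/to/client-cert.p12")) {
        keyStore.load(in, "changeit".toCharArray());
    }
    SSLContext sslContext = SSLContexts.custom()
            .loadKeyMaterial(keyStore, "changeit".toCharArray())
            .build();
    // Plain HTTP routes are untouched; HTTPS routes get the client-cert-capable context.
    Registry<SchemeIOSessionStrategy> registry = RegistryBuilder.<SchemeIOSessionStrategy>create()
            .register("http", NoopIOSessionStrategy.INSTANCE)
            .register("https", new SSLIOSessionStrategy(sslContext, new DefaultHostnameVerifier()))
            .build();
    return new PoolingNHttpClientConnectionManager(ioReactor, registry);
}

The manager returned here can then be passed to the existing HttpAsyncClients.custom().setConnectionManager(...) call, leaving the rest of the initialization code unchanged.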

Java Apache HTTP 401 response returned with correct credentials

I'm trying to hit a REST API URL using Apache HttpClient, but I keep getting a 401 error back. I can log in when I go to the URL in a browser, after being prompted for a password. The code I'm using is below:
CredentialsProvider provider = new BasicCredentialsProvider();
UsernamePasswordCredentials credentials = new UsernamePasswordCredentials(creds.get(0), creds.get(1));
provider.setCredentials(AuthScope.ANY, credentials);

AuthCache authCache = new BasicAuthCache();
authCache.put(new HttpHost(uri.getHost(), uri.getPort(), "https"), new BasicScheme());

BasicHttpContext context = new BasicHttpContext();
context.setAttribute(ClientContext.CREDS_PROVIDER, provider);
context.setAttribute(ClientContext.AUTH_CACHE, authCache);

DefaultHttpClient client = new DefaultHttpClient();
client.setHttpRequestRetryHandler(new DefaultHttpRequestRetryHandler());
client.setCredentialsProvider(provider);

HttpResponse response = null;
try
{
    // response = client.execute(new HttpGet(uri));
    response = client.execute(new HttpGet(uri), context);
}
catch (IOException e)
{
    logger.error("Error running authenticated get request: " + e);
}
I'm using HttpClient 4.2.3 and unfortunately I'm not able to upgrade this.
Any help would be appreciated! Thanks!
EDIT: it turns out I need to supply the certificate, like using --cacert in curl, but I can't find an example of how to do this!
Since you need to provide a certificate, maybe this can help:
http://hc.apache.org/httpcomponents-client-4.2.x/httpclient/examples/org/apache/http/examples/client/ClientCustomSSL.java
I think that example is compatible with 4.2.3.
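For HttpClient 4.2.x, the rough equivalent of curl's --cacert is to load the server's CA certificate into a trust store and register a custom https scheme built from it, along the lines of the linked example. A minimal sketch, where the trust store path and password are placeholders and client is the DefaultHttpClient from the question:

import java.io.File;
import java.io.FileInputStream;
import java.security.KeyStore;
import org.apache.http.conn.scheme.Scheme;
import org.apache.http.conn.ssl.SSLSocketFactory;

KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
FileInputStream in = new FileInputStream(new File("my.truststore"));
try {
    trustStore.load(in, "nopassword".toCharArray());
} finally {
    in.close();
}
// Build an SSL socket factory that trusts the certificates in the trust store
// and register it for https before executing the request.
SSLSocketFactory socketFactory = new SSLSocketFactory(trustStore);
Scheme https = new Scheme("https", 443, socketFactory);
client.getConnectionManager().getSchemeRegistry().register(https);

The trust store itself can be created beforehand with keytool, e.g. keytool -import -file ca.crt -keystore my.truststore.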

A zip file download works with apache httpclient v4.3.6 but fails for other versions

This is a little weird, and I have done a good deal of research into the cause and resolution of this issue. My objective is to download a zip file from a secured URL that also requires a login.
Everything works perfectly when I use the Apache httpclient Maven dependency at version 4.3.6. However, I cannot use this version, because my aws-sdk-java-core Maven dependency also depends on httpclient, and using v4.3.6 makes the aws-sdk-java fail with a NoSuchMethod exception at runtime. I understand this issue: the httpclient v4.3.6 dependency sits nearer in the Maven dependency tree than the version (4.5.1) used by the aws-sdk-java-core dependency. Anyway, I will cut down on the details here, because I am pretty sure I should make everything work with one version of the dependency and not use multiple versions of the same jar.
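For what it's worth, one common way to keep a single httpclient version for both the download code and the AWS SDK is to pin it in the POM's dependencyManagement section; the version shown here is just an example and should match whatever the rest of the build expects:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>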
Back to the original question. As I cannot use v4.3.6, I told my code to use v4.5.1, and that's when the file download code started giving problems. When I use httpclient v4.5.1, the response gives me the following HTML content rather than the zip file hosted at the requested https URL:
<html>
<HEAD><META HTTP-EQUIV='PRAGMA' CONTENT='NO-CACHE'><META HTTP-EQUIV='CACHE-
CONTROL' CONTENT='NO-CACHE'>
<TITLE>SAML 2.0 Auto-POST form</TITLE>
</HEAD>
<body onLoad="document.forms[0].submit()">
<NOSCRIPT>Your browser does not support JavaScript. Please click the
'Continue' button below to proceed. <br><br>
</NOSCRIPT>
<form action="https://githubext.deere.com/saml/consume" method="POST">
<input type="hidden" name="SAMLResponse" value="PFJlc3BvbnNlIHhtbG5zPSJ1cm46b2FzaXM6bmFtZXM6dGM6U0FNTDoyLjA6cHJvdG9jb2wiIERl">
<input type="hidden" name="RelayState" value="2F1HpzrUy5FdX">
<NOSCRIPT><INPUT TYPE="SUBMIT" VALUE="Continue"></NOSCRIPT>
</form>
</body>
</html>
When I use v4.3.6, the response gives me the zip file as expected. I have tried manually submitting this HTML form by adding more code, but the result remains the same.
The original code I have for the file download is provided below:
@Component
public class FileDAO {

    public static void main(String args[]) throws Exception {
        new FileDAO().loadFile("https://some_url.domain.com/zipball/master", "myfile.zip");
    }

    public String loadFile(String url, String fileName) throws ClientProtocolException, IOException {
        HttpClient client = login();
        HttpResponse response = client.execute(new HttpGet(url));
        int statusCode = response.getStatusLine().getStatusCode();
        if (statusCode == 200) {
            String unzipToFolderName = fileName.replace(".", "_");
            FileOutputStream outputStream = new FileOutputStream(new File(fileName));
            writeToFile(outputStream, response.getEntity().getContent());
            return unzipToFolderName;
        } else {
            throw new RuntimeException("error downloading file, HTTP Status code: " + statusCode);
        }
    }

    private void writeToFile(FileOutputStream outputStream, InputStream inputStream) {
        try {
            int read = 0;
            byte[] bytes = new byte[1024];
            while ((read = inputStream.read(bytes)) != -1) {
                outputStream.write(bytes, 0, read);
            }
        } catch (Exception ex) {
            throw new RuntimeException("error writing zip file, error message : " + ex.getMessage(), ex);
        } finally {
            try {
                outputStream.close();
                inputStream.close();
            } catch (Exception ex) {}
        }
    }

    private HttpClient login() throws IOException {
        HttpClient client = getHttpClient();
        HttpResponse response = client.execute(new HttpGet("https://some_url.domain.com"));
        String responseBody = EntityUtils.toString(response.getEntity());
        Document doc = Jsoup.parse(responseBody);
        org.jsoup.select.Elements inputs = doc.getElementsByTag("input");
        int statusCode = response.getStatusLine().getStatusCode();
        if (statusCode == 200) {
            HttpPost httpPost = new HttpPost("https://some_url.domain.com/saml/consume");
            List<NameValuePair> data = new ArrayList<NameValuePair>();
            data.add(new BasicNameValuePair("SAMLResponse", doc.select("input[name=SAMLResponse]").val()));
            data.add(new BasicNameValuePair("RelayState", doc.select("input[name=RelayState]").val()));
            httpPost.setEntity(new UrlEncodedFormEntity(data));
            HttpResponse logingResponse = client.execute(httpPost);
            int loginStatusCode = logingResponse.getStatusLine().getStatusCode();
            if (loginStatusCode != 302) {
                throw new RuntimeException("clone repo dao. error during login, HTTP Status code: " + loginStatusCode);
            }
        }
        return client;
    }

    private HttpClient getHttpClient() {
        CredentialsProvider provider = new BasicCredentialsProvider();
        UsernamePasswordCredentials credentials = new UsernamePasswordCredentials("userId", "password");
        provider.setCredentials(AuthScope.ANY, credentials);
        return HttpClientBuilder.create().setDefaultCredentialsProvider(provider).build();
    }
}
I am still analyzing what is going wrong with the Apache httpclient versions other than 4.3.6. The same code works with 4.3.6 but not with versions above it.
Any help is really appreciated. Thank you all.
Issue resolved. After going through the Apache HttpClient documentation and some serious debugging of the logs, I was able to resolve this issue. I captured two sets of server logs, one for v4.3.6 and the other for v4.5.2, compared them, and found that the culprit was the cookie policy. In the old version the cookie policy was (automatically) configured as BEST_MATCH and it worked. In v4.5.2, however, BEST_MATCH has been deprecated. I experimented with the cookie settings after adding some more code, but the cookies sent in the server response did not match the DEFAULT cookie policy I had configured in the client code. As a result the cookies were not being set properly, which is why the response was the SAML login page again instead of the zip file.
The Apache documentation says this about the cookie specifications:
Default: Default cookie policy is a synthetic policy that picks up either RFC 2965, RFC 2109 or Netscape draft compliant implementation based on properties of cookies sent with the HTTP response (such as version attribute, now obsolete). This policy will be deprecated in favor of the standard (RFC 6265 compliant) implementation in the next minor release of HttpClient.
Standard strict: State management policy compliant with the syntax and semantics of the well-behaved profile defined by RFC 6265, section 4.
I updated the cookie configuration to STANDARD_STRICT mode and everything started working with the latest version, 4.5.2.
Here is the updated getHttpClient() method:
private CloseableHttpClient getHttpClient() {
    CredentialsProvider provider = new BasicCredentialsProvider();
    UsernamePasswordCredentials credentials = new UsernamePasswordCredentials(gitUserId, gitPassword);
    provider.setCredentials(AuthScope.ANY, credentials);
    RequestConfig config = RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD_STRICT).build();
    return HttpClientBuilder.create()
            .setDefaultCredentialsProvider(provider)
            .setDefaultRequestConfig(config)
            .setRedirectStrategy(new LaxRedirectStrategy())
            .build();
}

Java Proxy: How to extract Destination Host and Port from the HttpRequest?

I am working on an HTTP proxy in Java. Basically I have three applications:
a client application, where I just submit a request to a server via a proxy
a proxy that captures the request, modifies it and then forwards it to the web server
the web server
Here is my code for the client (taken from the Apache HttpCore examples; it works well):
public class ClientExecuteProxy {

    public static void main(String[] args) throws Exception {
        HttpHost proxy = new HttpHost("127.0.0.1", 8080, "http");
        DefaultHttpClient httpclient = new DefaultHttpClient();
        try {
            httpclient.getParams().setParameter(ConnRoutePNames.DEFAULT_PROXY, proxy);

            HttpHost target = new HttpHost("issues.apache.org", 443, "https");
            HttpGet req = new HttpGet("/");

            System.out.println("executing request to " + target + " via " + proxy);
            HttpResponse rsp = httpclient.execute(target, req);
            HttpEntity entity = rsp.getEntity();

            System.out.println("----------------------------------------");
            System.out.println(rsp.getStatusLine());
            Header[] headers = rsp.getAllHeaders();
            for (int i = 0; i < headers.length; i++) {
                System.out.println(headers[i]);
            }
            System.out.println("----------------------------------------");
            if (entity != null) {
                System.out.println(EntityUtils.toString(entity));
            }
        } finally {
            // When HttpClient instance is no longer needed,
            // shut down the connection manager to ensure
            // immediate deallocation of all system resources
            httpclient.getConnectionManager().shutdown();
        }
    }
}
If I execute the request directly against the server (i.e. if I comment out the line httpclient.getParams().setParameter(ConnRoutePNames.DEFAULT_PROXY, proxy);), it works without any problem. But if I leave it in, the request passes through the proxy. Here is the part that I do not know how to handle in the proxy:
The proxy listens for requests, reads their content and verifies that they respect certain policies. If a request is OK the proxy forwards it to the server; otherwise it drops the request and sends back an HttpResponse with an error. The problem is when the request is OK and needs to be forwarded: how does the proxy know which address to forward it to? In other words, how do I get the information that the client entered at the line HttpHost target = new HttpHost("issues.apache.org", 443, "https");?
I've googled for a couple of hours but found nothing. Can anybody help me, please?
When you configure an HTTP proxy for an application or browser, one of two things happens:
There is a preceding CONNECT request that forms a tunnel; this tells you the target host:port, or
The entire target URL is placed in the GET/POST/... request line. Normally, without a proxy, this is just a relative URL, relative to the host:port of the TCP connection.
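To make that concrete, here is a rough sketch of how a proxy built on Apache HttpCore could pull the target host and port out of an incoming request. The method name is illustrative, and request is whatever your request handler receives:

import java.net.URI;
import org.apache.http.HttpHost;
import org.apache.http.HttpRequest;

// Work out where an incoming proxy request should be forwarded.
private static HttpHost extractTarget(HttpRequest request) throws Exception {
    String method = request.getRequestLine().getMethod();
    String uri = request.getRequestLine().getUri();
    if ("CONNECT".equalsIgnoreCase(method)) {
        // e.g. "CONNECT issues.apache.org:443 HTTP/1.1" -> the URI is simply host:port
        String[] parts = uri.split(":");
        return new HttpHost(parts[0], Integer.parseInt(parts[1]), "https");
    }
    // For plain proxied requests the request line carries an absolute URL,
    // e.g. "GET http://issues.apache.org/ HTTP/1.1"
    URI parsed = new URI(uri);
    int port = parsed.getPort() != -1 ? parsed.getPort() : 80;
    return new HttpHost(parsed.getHost(), port, parsed.getScheme());
}

Since the client above targets https on port 443, it will reach the proxy as a CONNECT request; the absolute-URL case is what you would see for a plain http:// target.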

cURL app using Java

Does anyone have a good tutorial on how to write a Java or JavaFX cURL-style application? I have seen tons of tutorials on how to make an external call to, say, an XML file, but the XML feed I am trying to retrieve requires you to submit a username and password before you can retrieve it.
What are you trying to accomplish? Are you trying to retrieve an XML feed over HTTP?
In that case I suggest you take a look at Apache HttpClient. It offers similar functionality to cURL, but in a pure Java way (cURL is a native C application). HttpClient supports multiple authentication mechanisms. For example, you can submit a username/password using Basic authentication like this:
public static void main(String[] args) throws Exception {
    DefaultHttpClient httpclient = new DefaultHttpClient();

    httpclient.getCredentialsProvider().setCredentials(
            new AuthScope("localhost", 443),
            new UsernamePasswordCredentials("username", "password"));

    HttpGet httpget = new HttpGet("https://localhost/protected");

    System.out.println("executing request" + httpget.getRequestLine());
    HttpResponse response = httpclient.execute(httpget);
    HttpEntity entity = response.getEntity();

    System.out.println("----------------------------------------");
    System.out.println(response.getStatusLine());
    if (entity != null) {
        System.out.println("Response content length: " + entity.getContentLength());
    }
    if (entity != null) {
        entity.consumeContent();
    }

    // When HttpClient instance is no longer needed,
    // shut down the connection manager to ensure
    // immediate deallocation of all system resources
    httpclient.getConnectionManager().shutdown();
}
Check the website for more examples.
