Java file upload to Windows shared folder with authentication - java

I'm trying to upload some files to a shared folder on which I've been granted full control, but the connection fails for what seem to be authentication reasons. This is the piece of code I've used for some write tests:
String destination = "serverX/shareFolder/";
String domain = "myDomain";
// jCIFS URL syntax: smb://domain;user:password@server/share/
String smbUrl = "smb://" + domain + ";user1:pwd1@" + destination;
SmbFile sFile = new SmbFile(smbUrl);
SmbFileOutputStream sfos = new SmbFileOutputStream(sFile);
sfos.write("Test".getBytes());
sfos.close();
This is the error I received:
jcifs.smb.SmbAuthException: The referenced account is currently locked out and may not be logged on to.

It looks like the code is fine; the problem seems to be the account you are using. Have you tried another account? Or maybe you can unlock the account somehow on the server.
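As an aside, the credentials can also be passed outside the URL via NtlmPasswordAuthentication, which avoids escaping issues when the password contains special characters. A minimal sketch of the same write test under that approach (test.txt is a placeholder file name):
import jcifs.smb.NtlmPasswordAuthentication;
import jcifs.smb.SmbFile;
import jcifs.smb.SmbFileOutputStream;

// Credentials held in an NtlmPasswordAuthentication object instead of the smb:// URL.
NtlmPasswordAuthentication auth = new NtlmPasswordAuthentication("myDomain", "user1", "pwd1");
SmbFile sFile = new SmbFile("smb://serverX/shareFolder/test.txt", auth);
try (SmbFileOutputStream sfos = new SmbFileOutputStream(sFile)) {
    sfos.write("Test".getBytes());
}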


Drive API v3 Java/Android - list files sharedWithMe

I need my app to upload files to the GDrive account that can be listed and read by other user accounts (same app, other devices).
I am uploading files from aUserFileCreator@gmail.com, setting the permission to anyone/reader, and making the file public with allowFileDiscovery:
File file = driveService.files()
        .create(fileMetadata, mediaContent)
        .setFields("id")
        .setSupportsAllDrives(true)
        .setIgnoreDefaultVisibility(true)
        .execute();
Permission p = new Permission();
p.setType("anyone");
p.setRole("reader");
p.setAllowFileDiscovery(true);
driveService.permissions()
        .create(file.getId(), p)
        .execute();
I later share the file with TheUserToListAndRead@gmail.com:
Permission accessPermission = new Permission();
accessPermission.setEmailAddress("TheUserToListAndRead@gmail.com");
accessPermission.setType("user");
accessPermission.setRole("reader");
driveService.permissions().create(fileId, accessPermission).execute();
When trying to list the files, as below, I am not getting anything back, but the files are visible in the Drive app of TheUserToListAndRead@gmail.com:
FileList result = driveService.files().list()
        .setQ("not 'me' in owners")
        .setIncludeItemsFromAllDrives(true)
        .setSupportsAllDrives(true)
        .setSpaces("drive,appDataFolder")
        .setCorpora("allDrives")
        .execute();
Alternatively, I used setQ("sharedWithMe") with no success.
The code works for files in the reader's Drive account (only those created and owned by the reader) when I remove setQ completely or set it to a MIME type of some sort.
First let me say I am NOT an Android developer; I do, however, know the Google API Java client library. I do not know whether this works on Android and would love to hear if it does.
What you are looking for is something called a service account. Service accounts are like dummy users. If you share a directory with a service account, the service account will have access to that directory without a user having to authorize it.
// Load the service account key and request the Drive scope.
GoogleCredential credential = GoogleCredential
        .fromStream(new FileInputStream("MyProject-1234.json"))
        .createScoped(Collections.singleton(DriveScopes.DRIVE));
How to create service account credentials. Just remember to enable the Google Drive API under Library.
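From there, a minimal sketch of wiring that credential into a Drive client (the application name is a placeholder):
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.drive.Drive;

// Build a Drive service that acts as the service account.
Drive driveService = new Drive.Builder(
        GoogleNetHttpTransport.newTrustedTransport(),
        JacksonFactory.getDefaultInstance(),
        credential)
        .setApplicationName("MyProject") // placeholder
        .build();
Any folder your regular accounts share with the service account's email address then becomes listable and readable through this driveService.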

Graph API with SSO is not working in Azure AD

I am trying to develop a Java web application with SSO by following this Azure tutorial. I created an account in Azure and created an AD. I developed and deployed the code on Tomcat. When I try to access the page, I get the following error:
Exception - java.io.IOException: Server returned HTTP response code: 403 for URL: https://graph.windows.net/ppceses.onmicrosoft.com/users?api-version=2013-04-05
I could not find enough answers for this error. I changed the api-version to 1.6; even then it did not work.
MORE ANALYSIS:
After troubleshooting, I found out that the logged-in user's info is fetched and is available in the Sessions object. It errors out when it is trying to read the response and convert it into a String object. The following is the calling method where it errors out:
HttpClientHelper.getResponseStringFromConn(conn, true);
The actual method that writes the response into a String:
public static String getResponseStringFromConn(HttpURLConnection conn, boolean isSuccess) throws IOException {
    BufferedReader reader = null;
    if (isSuccess) {
        reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    } else {
        reader = new BufferedReader(new InputStreamReader(conn.getErrorStream()));
    }
    StringBuffer stringBuffer = new StringBuffer();
    String line = "";
    while ((line = reader.readLine()) != null) {
        stringBuffer.append(line);
    }
    return stringBuffer.toString();
}
The actual issue is in the Graph API call, where we try to read the response into String format.
@Anand, according to Microsoft Graph error responses and resource types, response code 403 means Forbidden:
Access is denied to the requested resource. The user might not have enough permission.
Please go to the CONFIGURE tab of your application registered in your AAD domain on the Azure classic portal, then check whether enough permissions have been enabled.
I got the same error and had been struggling with it for a few days. What I noticed was that even if I checked ALL permissions for Windows Azure Active Directory, I still got the 403. So I deleted the app in App Registrations and created it again from scratch, generated a new application key and re-added the reply URLs. In Required Permissions / Windows Azure Active Directory, check:
Sign in and read user profile
Access the directory as the signed-in user
I can now call me/memberOf successfully.
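For reference, a minimal sketch of that call over a plain HttpURLConnection, assuming you already hold a valid access token in an accessToken variable:
import java.net.HttpURLConnection;
import java.net.URL;

// Query the directory objects the signed-in user is a member of (Azure AD Graph).
URL url = new URL("https://graph.windows.net/me/memberOf?api-version=1.6");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Authorization", "Bearer " + accessToken); // token from the SSO flow
conn.setRequestProperty("Accept", "application/json");

// Reuse the helper shown earlier to read the JSON response.
String json = HttpClientHelper.getResponseStringFromConn(conn, conn.getResponseCode() == 200);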
Hope it helps.
The below worked for me.
In Active Directory app registrations -> app -> Settings -> Permissions, enable the delegated permission to read directory data. Save and close the blade, then also click Grant Permissions and close the blade.
Once the above is done, log out and log back in to the application with a fresh token. (I guess a token with prior authorizations does not reflect the latest permission changes, hence the re-login may be what worked in my case.)

Accessing Kerberos-secured WebHDFS without SPNEGO

I have a working application for managing HDFS using WebHDFS.
I need to be able to do this on a Kerberos-secured cluster.
The problem is that there is no library or extension to negotiate the ticket for my app; I only have a basic HTTP client.
Would it be possible to create a Java service that handles the ticket exchange and, once it gets the service ticket, just passes it to the app for use in an HTTP request?
In other words, my app would ask the Java service to negotiate the tickets, the service would return the service ticket to my app as a string, and the app would just attach it to the HTTP request.
EDIT: Is there a similarly elegant solution, like the one @SamsonScharfrichter described, for HttpFS? (To my knowledge, it does not support delegation tokens.)
EDIT 2: Hi guys, I am still completely lost. I'm trying to figure out the hadoop-auth client without any luck. Could you please help me out again? I have already spent hours reading up on it.
The examples say to do this:
// establishing an initial connection

URL url = new URL("http://foo:8080/bar");
AuthenticatedURL.Token token = new AuthenticatedURL.Token();
AuthenticatedURL aUrl = new AuthenticatedURL();
HttpURLConnection conn = new AuthenticatedURL(url, token).openConnection();
....
// use the 'conn' instance
....
I'm lost already here. What initial connection do I need? How can
new AuthenticatedURL(url, token).openConnection();
take two parameters? There is no constructor for such a case (I'm getting an error because of this). Shouldn't a principal be specified somewhere? It is probably not going to be this easy.
URL url = new URL("http://<host>:14000/webhdfs/v1/?op=liststatus");
AuthenticatedURL.Token token = new AuthenticatedURL.Token();
HttpURLConnection conn = new AuthenticatedURL(url, token).openConnection(url, token);
Using Java code plus the Hadoop Java API to open a Kerberized session, get the delegation token for the session, and pass that token to the other app -- as suggested by @tellisnz -- has a drawback: the Java API requires quite a lot of dependencies (i.e. a lot of JARs, plus Hadoop native libraries). If you run your app on Windows, in particular, it will be a tough ride.
Another option is to use Java code plus WebHDFS to run a single SPNEGOed query and GET the delegation token, then pass it to the other app -- that option requires absolutely no Hadoop library on your server. The barebones version would be something like:
URL urlGetToken = new URL("http://<host>:<port>/webhdfs/v1/?op=GETDELEGATIONTOKEN");
HttpURLConnection cnxGetToken = (HttpURLConnection) urlGetToken.openConnection();
BufferedReader httpMessage = new BufferedReader(
        new InputStreamReader(cnxGetToken.getInputStream()), 1024);
// The JSON response contains "urlString":"<token>" -- extract the token value
Pattern regexHasToken = Pattern.compile("urlString[\": ]+(.[^\" ]+)");
String httpMessageLine;
while ((httpMessageLine = httpMessage.readLine()) != null) {
    Matcher regexToken = regexHasToken.matcher(httpMessageLine);
    if (regexToken.find()) {
        System.out.println("Use that template: http://<Host>:<Port>/webhdfs/v1%AbsPath%?delegation="
                + regexToken.group(1) + "&op=...");
    }
}
httpMessage.close();
That's what I use to access HDFS from a Windows PowerShell script (or even an Excel macro). Caveat: on Windows you have to create your Kerberos TGT on the fly, by passing the JVM a JAAS config pointing to the appropriate keytab file. But that caveat also applies to the Java API anyway.
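A sketch of what such a JAAS config might look like, assuming a keytab-based login (the keytab path and principal are placeholders; com.sun.security.jgss.krb5.initiate is the entry name the Oracle/OpenJDK GSS layer looks up):
// login.conf -- Kerberos TGT from a keytab, picked up by the JVM's GSS layer
com.sun.security.jgss.krb5.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="C:/path/to/user.keytab"
    principal="user@EXAMPLE.COM"
    storeKey=true
    doNotPrompt=true;
};
The JVM is then started with -Djava.security.auth.login.config=login.conf and -Djavax.security.auth.useSubjectCredsOnly=false so the HTTP stack can pick up the credentials.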
You could take a look at the hadoop-auth client and create a service which does the first connection; then you might be able to grab the 'Authorization' and 'X-Hadoop-Delegation-Token' headers and the cookie from it and add them to your basic client's requests.
First, you'll need to have used kinit to authenticate your user before running the application. Otherwise, you're going to have to do a JAAS login for your user; this tutorial provides a pretty good overview of how to do that.
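A minimal sketch of such a programmatic JAAS login, assuming a hypothetical "KrbLogin" entry in the config file passed via -Djava.security.auth.login.config:
import java.security.PrivilegedExceptionAction;
import javax.security.auth.Subject;
import javax.security.auth.login.LoginContext;

// Authenticate against the "KrbLogin" entry of the JAAS config (hypothetical name).
LoginContext lc = new LoginContext("KrbLogin");
lc.login();
Subject subject = lc.getSubject();

// Run the WebHDFS calls below as the authenticated subject.
Subject.doAs(subject, (PrivilegedExceptionAction<Void>) () -> {
    // ... AuthenticatedURL code from the next snippet goes here ...
    return null;
});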
Then, to do the login to WebHDFS/HttpFS, we'll need to do something like:
URL url = new URL("http://youhost:8080/your-kerberised-resource");
AuthenticatedURL.Token token = new AuthenticatedURL.Token();
HttpURLConnection conn = new AuthenticatedURL().openConnection(url, token);
String authorizationTokenString = conn.getRequestProperty("Authorization");
String delegationToken = conn.getRequestProperty("X-Hadoop-Delegation-Token");
...
// do what you have to to get your basic client connection
...
myBasicClientConnection.setRequestProperty("Authorization", authorizationTokenString);
myBasicClientConnection.setRequestProperty("Cookie", "hadoop.auth=" + token.toString());
myBasicClientConnection.setRequestProperty("X-Hadoop-Delegation-Token", delegationToken);

How to use Commons VFS to read an Excel file from an HTTP server that displays an authentication pop-up in the browser?

I am new to Commons VFS. I am trying to read an Excel sheet located at http://starpoint.com/...
Whenever I open this URL in a browser, it gives me an 'authentication required' pop-up. I tried the code below, but it didn't work:
StaticUserAuthenticator auth = new StaticUserAuthenticator("domain", "username", "password");
FileSystemOptions opts = new FileSystemOptions();
DefaultFileSystemConfigBuilder.getInstance().setUserAuthenticator(opts, auth);
FileObject fo = VFS.getManager().resolveFile("http://starpoint.com/../", opts);
I also tried entering the username and password in the URL in the format http://username:password@starpoint.com/..../ but it does not work.
One more thing: if my username is xyz\john, will xyz be my domain in the first line of the code? Any other approach would also be appreciated.
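On that last point, a sketch under the assumption that Commons VFS's StaticUserAuthenticator takes the domain as its first constructor argument:
// Domain "xyz", user "john" -- split the xyz\john login across the
// first two arguments of StaticUserAuthenticator(domain, username, password).
StaticUserAuthenticator auth = new StaticUserAuthenticator("xyz", "john", "password");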

Create file on another server

How would I create a file on another Windows server? The server has a username, a password, an IP address, and a specific directory.
SAMBA! Braziiillll, Braziiiiiiillll!
Something like this:
import java.io.PrintStream;
import jcifs.smb.NtlmPasswordAuthentication;
import jcifs.smb.SmbFile;
import jcifs.smb.SmbFileOutputStream;

String userPass = "username:password"; // or "domain;username:password"
String filePath = "smb://ip_address/shared_folder/file_name";
NtlmPasswordAuthentication authentication = new NtlmPasswordAuthentication(userPass);
SmbFile smbFile = new SmbFile(filePath, authentication);
SmbFileOutputStream smbFileOutputStream = new SmbFileOutputStream(smbFile);
PrintStream printStream = new PrintStream(smbFileOutputStream);
// You should be good from this point on...
NOTE: The destination folder needs to be shared first!
As @orm already points out, this is answered already here: FTP upload via sockets.
Basically, you could reuse an existing library like Apache Commons Net to do it. For the specifics of using the FTP client, have a look at the documentation for the FTPClient class.
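A minimal sketch with Commons Net, assuming plain FTP and placeholder host, credentials, and path:
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

FTPClient ftp = new FTPClient();
ftp.connect("ip_address");
ftp.login("username", "password");
ftp.setFileType(FTP.BINARY_FILE_TYPE);
ftp.enterLocalPassiveMode();

// Upload a small file to the target directory.
try (InputStream in = new ByteArrayInputStream("Test".getBytes())) {
    ftp.storeFile("/specific_directory/file_name.txt", in);
}
ftp.logout();
ftp.disconnect();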
