Google Storage: Clicking on "Public Link" downloads the files - java

I am referring to the GitHub link below:
https://github.com/GoogleCloudPlatform/java-docs-samples/blob/master/storage/json-api/src/main/java/StorageSample.java
Code snippet:
public static void uploadFile(String bucketName, String targetPath, String filePath) throws Exception {
    Storage storage = getStorage();
    StorageObject object = new StorageObject();
    object.setBucket(bucketName);
    File file = new File(filePath);
    InputStream stream = new FileInputStream(file);
    try {
        // String contentType = URLConnection.guessContentTypeFromStream(stream);
        InputStreamContent content = new InputStreamContent("image/jpeg", stream);
        Storage.Objects.Insert insert = storage.objects().insert(bucketName, null, content);
        insert.setName(targetPath + file.getName());
        insert.execute();
    } finally {
        stream.close();
    }
}
public static void uploadFile(String name, String targetPath, String contentType, File file, String bucketName)
        throws IOException, GeneralSecurityException, Exception {
    InputStreamContent contentStream = new InputStreamContent(contentType, new FileInputStream(file));
    contentStream.setLength(file.length());
    StorageObject objectMetadata = new StorageObject().setName(targetPath + name)
            .setAcl(Arrays.asList(new ObjectAccessControl().setEntity("allUsers").setRole("READER")));
    Storage client = getStorage();
    Storage.Objects.Insert insertRequest = client.objects().insert(bucketName, objectMetadata, contentStream);
    insertRequest.execute();
}
private static Storage getStorage() throws Exception {
    if (storage == null) {
        HttpTransport httpTransport = new NetHttpTransport();
        JsonFactory jsonFactory = new JacksonFactory();
        List<String> scopes = new ArrayList<String>();
        scopes.add(StorageScopes.DEVSTORAGE_FULL_CONTROL);
        // Collection<String> scopes = StorageScopes.all();
        Credential credential = new GoogleCredential.Builder()
                .setTransport(httpTransport)
                .setJsonFactory(jsonFactory)
                .setServiceAccountId(getProperties().getProperty(ACCOUNT_ID_PROPERTY))
                .setServiceAccountPrivateKeyFromP12File(
                        new File(getProperties().getProperty(PRIVATE_KEY_PATH_PROPERTY)))
                .setServiceAccountScopes(scopes)
                .build();
        storage = new Storage.Builder(httpTransport, jsonFactory, credential)
                .setApplicationName(getProperties().getProperty(APPLICATION_NAME_PROPERTY))
                .build();
    }
    return storage;
}
public static void main(String[] args) throws Exception {
    // CloudStorage.createBucket("my-bucket3/good");
    CloudStorage.uploadFile("my-static", "temp/",
            "/Users/rupanjan/Downloads/15676285_10158033346085311_1317913818452680683_o.jpg");
    CloudStorage.uploadFile("15676285_10158033346085311_1317913818452680683_o.jpg", "temp/", "image/jpeg",
            new File("/Users/rupanjan/Downloads/15676285_10158033346085311_1317913818452680683_o.jpg"),
            "my-static");
    // CloudStorage.downloadFile("my-bucket", "some-file.jpg", "/var/downloads");
    List<String> buckets = CloudStorage.listBuckets();
    System.out.println(buckets.size());
}
The issue I am facing:
I am able to upload the file successfully, but whenever I click on that "Public Link", the file downloads automatically. My intention was to share it with all users with read access.
N.B. If I upload the file manually from the browser, I can open it in the browser; but when I upload it programmatically, it downloads every time I click on "Public Link".
Please correct me if I am missing anything!

It sounds like you know the names of the bucket and the object, that the objects are all publicly readable, and that you want a shareable URL that allows others to read the object.
There is no need to use the "public link" functionality in the console for this. Public URLs can be constructed programmatically. They follow this pattern:
https://storage.googleapis.com/bucket_name/object_name
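The pattern above can be built with plain string concatenation. A minimal sketch, with the bucket and object names below chosen for illustration (note that object names containing spaces or other special characters would additionally need percent-encoding, which this sketch does not perform):

```java
public class PublicUrl {

    // Build the public URL for an object, following the
    // https://storage.googleapis.com/bucket_name/object_name pattern.
    // "/" separators in the object name are kept so "folder" paths still work.
    static String publicUrl(String bucketName, String objectName) {
        return "https://storage.googleapis.com/" + bucketName + "/" + objectName;
    }

    public static void main(String[] args) {
        // prints "https://storage.googleapis.com/my-static/temp/photo.jpg"
        System.out.println(publicUrl("my-static", "temp/photo.jpg"));
    }
}
```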

Related

Getting java.net.ConnectException: Connection refused (Connection refused) while creating object using fake-gcs-server image

I am getting a java.net.ConnectException: Connection refused error while trying to write a JUnit test that creates an object in GCS using the fake-gcs-server image. Please find the code below. The bucket name is test; assume it has already been created.
@TempDir
private static File directory;

private static final GenericContainer<?> GCS_CONTAINER =
        new GenericContainer<>(DockerImageName.parse("fsouza/fake-gcs-server:1.33.1"))
                .withExposedPorts(4443)
                .withCreateContainerCmdModifier(cmd -> cmd.withEntrypoint(
                        "/bin/fake-gcs-server",
                        "-scheme", "http"));

String fakeGcsExternalUrl = "http://" + GCS_CONTAINER.getContainerIpAddress() + ":" + GCS_CONTAINER.getFirstMappedPort();

private static final Storage storage = new Storage.Builder(getTransport(), GsonFactory.getDefaultInstance(), null)
        .setRootUrl(fakeGcsExternalUrl)
        .setApplicationName("test")
        .build();

void test() {
    final File localFile1 = new File(directory.getAbsolutePath() + File.separator + "testFile.txt");
    localFile1.createNewFile();
    try (final FileWriter fileWriter = new FileWriter(localFile1.getPath())) {
        fileWriter.write("Test gs file content");
    }
    final InputStream stream = FileUtils.openInputStream(localFile1);
    Path path = localFile1.toPath();
    String contentType = Files.probeContentType(path);
    uploadFile("test", "/sampleFiles/newFile.txt", contentType, stream, null);
}
public String uploadFile(final Storage storage, final String bucketName, final String filePath,
        final String contentType, final InputStream inputStream, final Map<String, String> metadata)
        throws IOException {
    final InputStreamContent contentStream = new InputStreamContent(contentType, inputStream);
    final StorageObject objectMetadata = new StorageObject().setName(filePath);
    objectMetadata.setMetadata(GoogleLabels.manageLabels(metadata));
    final Storage.Objects.Insert insertRequest = storage.objects().insert(bucketName, objectMetadata,
            contentStream);
    return insertRequest.execute().getName();
}
This problem might be caused by not setting fake-gcs-server's external URL property to the container's address. Make sure you follow the guide in the official repo, https://github.com/fsouza/fake-gcs-server/blob/cf3fcb083e19553636419818e29f84825bd1e13c/examples/java/README.md, and in particular that you execute the following code:
private static void updateExternalUrlWithContainerUrl(String fakeGcsExternalUrl) throws Exception {
    String modifyExternalUrlRequestUri = fakeGcsExternalUrl + "/_internal/config";
    String updateExternalUrlJson = "{"
            + "\"externalUrl\": \"" + fakeGcsExternalUrl + "\""
            + "}";
    HttpRequest req = HttpRequest.newBuilder()
            .uri(URI.create(modifyExternalUrlRequestUri))
            .header("Content-Type", "application/json")
            .PUT(BodyPublishers.ofString(updateExternalUrlJson))
            .build();
    HttpResponse<Void> response = HttpClient.newBuilder().build()
            .send(req, BodyHandlers.discarding());
    if (response.statusCode() != 200) {
        throw new RuntimeException(
                "error updating fake-gcs-server with external url, response status code "
                        + response.statusCode() + " != 200");
    }
}
before using the container.

Error while uploading huge base64 file to S3 Bucket

I am trying to upload a huge video file to an S3 bucket.
I get the data from the client side in base64 format, which I send to the S3 client to upload to my bucket, as shown below:
public class UploadFileService {

    private static final String BUCKET_NAME = "data";
    private static final Regions REGION = Regions.US_EAST_2;
    LoggerUtils loggerUtils = new LoggerUtils();

    public String uploadFile(String fileData, String fileName, String contentType, String extension) {
        try {
            loggerUtils.log("File Data", fileData);
            byte[] bI = org.apache.commons.codec.binary.Base64.decodeBase64(
                    (fileData.substring(fileData.indexOf(",") + 1)).getBytes());
            InputStream fis = new ByteArrayInputStream(bI);
            AmazonS3 s3 = new AmazonS3Client();
            Region usWest02 = Region.getRegion(REGION);
            s3.setRegion(usWest02);
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentLength(bI.length);
            metadata.setContentType("video/mp4");
            // metadata.setContentType(contentType + "/" + extension.substring(1));
            metadata.setCacheControl("public, max-age=0");
            s3.putObject(BUCKET_NAME, fileName, fis, metadata);
            s3.setObjectAcl(BUCKET_NAME, fileName, CannedAccessControlList.PublicRead);
            URL s3Url = s3.getUrl(BUCKET_NAME, fileName);
            return s3Url.toExternalForm();
        } catch (Exception exception) {
            loggerUtils.log(exception.toString());
            throw exception;
        }
    }

    public static void main(String[] args) {
        String fileName = "abc1.mp4";
        String fileData = "hkbk";
        new UploadFileService().uploadFile(fileData, fileName, null, null);
    }
}
But if fileData is huge (the base64 of a 2 MB video), then I get the error below:
Error:(46, 27) java: constant string too long
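The "constant string too long" message is a javac limit: a String literal cannot exceed 65535 bytes in the class file's constant pool, so a 2 MB base64 payload cannot be pasted into the source. A minimal sketch of one way around it, assuming the payload sits in a local file (the file name is hypothetical); `Base64.getDecoder().wrap` decodes as a stream, so the whole payload never has to live in memory as a String:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class Base64FromFile {

    // Open a stream that decodes the base64 file lazily as it is read.
    static InputStream decodedStream(Path base64File) throws IOException {
        return Base64.getDecoder().wrap(Files.newInputStream(base64File));
    }

    public static void main(String[] args) throws IOException {
        // Demo with a small temp file standing in for the real payload file.
        Path tmp = Files.createTempFile("payload", ".b64");
        Files.write(tmp, Base64.getEncoder().encode("hello".getBytes()));
        try (InputStream in = decodedStream(tmp)) {
            System.out.println(new String(in.readAllBytes())); // prints "hello"
        }
    }
}
```

The resulting `InputStream` can be handed straight to `s3.putObject` in place of the `ByteArrayInputStream` above.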

Error with CognitiveJ/ExampleCode

I would like to use CognitiveJ (GitHub project CognitiveJ), but all I get is:
Status:401; Body: {"error":{"code":"Unspecified","message":"Access denied due to invalid subscription key. Make sure you are subscribed to an API you are trying to call and provide the right key."}}
Here is the Code:
public static String lic1 = "xxx";
public static String lic2 = "xxx";

public static void main(String[] args) throws IOException {
    new Bildkontrolle();
}

public Bildkontrolle() throws IOException {
    File imageFile = new File("E:\\DSC00306.jpg");
    new FaceRecognicion(lic1, lic2, imageFile);
}
And here is the second class:
public FaceRecognicion(String lic1, String lic2, File imageFile) throws IOException {
    BufferedImage bufImage = ImageIO.read(imageFile);
    InputStream inpStream = new FileInputStream(imageFile);
    FaceScenarios faceScenarios = new FaceScenarios(lic1, lic1);
    ImageOverlayBuilder imageOverlayBuilder = ImageOverlayBuilder.builder(bufImage);
    imageOverlayBuilder.outlineFacesOnImage(faceScenarios.findFaces(inpStream), RectangleType.FULL,
            CognitiveJColourPalette.STRAWBERRY).launchViewer();
}
Does anyone have example code showing how to use the API?
I am stuck at the point of where to send the request.

Java Google Cloud Storage upload media link null, but image uploads

I'm trying to upload an image to an existing bucket in my Google Cloud Storage.
The image file is uploaded successfully when I go and check, but the returned download URL is null.
CODE
private String uploadImage(File filePath, String blobName, File uploadCreds) throws FileNotFoundException, IOException {
    Storage storage = StorageOptions.newBuilder().setProjectId("myProjectId")
            .setCredentials(ServiceAccountCredentials.fromStream(new FileInputStream(uploadCreds)))
            .build()
            .getService();
    String bucketName = "myBucketName";
    Bucket bucket = storage.get(bucketName);
    BlobId blobId = BlobId.of(bucket.getName(), blobName);
    InputStream inputStream = new FileInputStream(filePath);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("image/jpeg").build();
    try (WriteChannel writer = storage.writer(blobInfo)) {
        byte[] buffer = new byte[1024];
        int limit;
        try {
            while ((limit = inputStream.read(buffer)) >= 0) {
                writer.write(ByteBuffer.wrap(buffer, 0, limit));
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            writer.close();
        }
        System.out.println("Image URL : " + blobInfo.getMediaLink());
        System.out.println("Blob URL : " + blobInfo.getSelfLink());
        return blobInfo.getMediaLink();
    }
}
filePath is the image file
blobName is a random image name
uploadCreds is my credentials.json file
Why are blobInfo.getMediaLink() and blobInfo.getSelfLink() returning null? What am I doing wrong?
Here is my code that works perfectly:
@RestController
@RequestMapping("/api")
public class CloudStorageHelper {

    Credentials credentials = GoogleCredentials.fromStream(
            new FileInputStream("C:\\Users\\sachinthah\\Downloads\\MCQ project -1f959c1fc3a4.json"));
    Storage storage = StorageOptions.newBuilder().setCredentials(credentials).build().getService();

    public CloudStorageHelper() throws IOException {
    }

    @SuppressWarnings("deprecation")
    @RequestMapping(method = RequestMethod.POST, value = "/imageUpload112")
    public String uploadFile(@RequestParam("fileseee") MultipartFile fileStream)
            throws IOException, ServletException {
        BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        String bucketName = "mcqimages";
        checkFileExtension(fileStream.getName());
        DateTimeFormatter dtf = DateTimeFormat.forPattern("-YYYY-MM-dd-HHmmssSSS");
        DateTime dt = DateTime.now(DateTimeZone.UTC);
        String fileName = fileStream.getOriginalFilename();
        Blob blobInfo = storage.create(
                BlobInfo.newBuilder(bucketName, fileName)
                        .setAcl(new ArrayList<>(Arrays.asList(Acl.of(User.ofAllUsers(), Role.READER))))
                        .build(),
                fileStream.getInputStream());
        System.out.println(blobInfo.getMediaLink());
        // sachintha added a comma after the link to identify the link that gets generated
        return blobInfo.getMediaLink() + ",";
    }

    private void checkFileExtension(String fileName) throws ServletException {
        if (fileName != null && !fileName.isEmpty() && fileName.contains(".")) {
            String[] allowedExt = {".jpg", ".jpeg", ".png", ".gif"};
            for (String ext : allowedExt) {
                if (fileName.endsWith(ext)) {
                    return;
                }
            }
            throw new ServletException("file must be an image");
        }
    }
}
The answer was quite simple: I just got rid of the manual upload method and used the built-in create blob.
private String uploadImage(File filePath, String blobName, File uploadCreds) throws FileNotFoundException, IOException {
    Storage storage = StorageOptions.newBuilder().setProjectId("porjectId")
            .setCredentials(ServiceAccountCredentials.fromStream(new FileInputStream(uploadCreds)))
            .build()
            .getService();
    String bucketName = "bucketName";
    Bucket bucket = storage.get(bucketName);
    BlobId blobId = BlobId.of(bucket.getName(), blobName);
    InputStream inputStream = new FileInputStream(filePath);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("image/jpeg").build();
    Blob blob = storage.create(blobInfo, inputStream);
    System.out.println("Image URL : " + blob.getMediaLink());
    return blob.getMediaLink();
}
In case you want to store it in a special folder, prepend the folder path and a "/" to the blob name. For example, if temp.jpg needs to be stored under date folders, get the date from a Date object, format it with a date formatter, and prepend it:
blobName = date + "/" + blobName;
This will classify all images date-wise.
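The date-prefix idea above can be sketched as follows; the helper name and the ISO date format are assumptions for illustration:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DatePrefix {

    // Prepend an ISO date "folder" to a blob name, e.g. "2024-01-31/temp.jpg".
    // GCS has no real directories; the "/" in the object name acts as one.
    static String datePrefixed(String blobName, LocalDate date) {
        return date.format(DateTimeFormatter.ISO_LOCAL_DATE) + "/" + blobName;
    }

    public static void main(String[] args) {
        // prints "2024-01-31/temp.jpg"
        System.out.println(datePrefixed("temp.jpg", LocalDate.of(2024, 1, 31)));
    }
}
```

The result would then be passed as the blobName argument of the uploadImage method above.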

File sending Via Java

My code is:
public class RemotePlay {

    static final String USER_NAME = "bwisniewski";
    static final String PASSWORD = "xxx";
    static final String NETWORK_FOLDER = "smb://192.168.1.141/ADMIN$/";

    public static void main(String[] args) throws IOException, InterruptedException {
        String fileContent = "This is a test File";
        new RemotePlay().copyFiles(fileContent, "testFile1.txt");
    }

    public boolean copyFiles(String fileContent, String fileName) {
        boolean successful = false;
        try {
            String user = USER_NAME + ":" + PASSWORD;
            System.out.println("User: " + user);
            NtlmPasswordAuthentication auth = new NtlmPasswordAuthentication(user);
            String path = NETWORK_FOLDER + fileName;
            System.out.println("Path: " + path);
            SmbFile sFile = new SmbFile(path, auth);
            SmbFileOutputStream sfos = new SmbFileOutputStream(sFile);
            sfos.write(fileContent.getBytes());
            successful = true;
            System.out.println("Successful " + successful);
        } catch (Exception e) {
            successful = false;
            e.printStackTrace();
        }
        return successful;
    }
}
How can I change it to send an exe file to the ADMIN$ share? I prefer this method because I have to authenticate to the remote PC. If you have better ideas for copying a file to the ADMIN$ share, I look forward to hearing them.
Thanks.
sfos.write(fileContent.getBytes());
If your data is text, then why not use a PrintWriter to write your file:
public static void main(String[] args) throws Exception { // temporary
    File fileOne = new File("testfile1.txt");
    PrintWriter writer = new PrintWriter(fileOne);
    // write down data
    writer.println("This is a test File");
    // free resources
    writer.flush();
    writer.close();
}
As for the extension, you can use any extension you want when creating the file; it will still hold the data, and it can be opened if you rename it to the correct extension on the hard drive.
If you name your file testfile.exe, it will still hold your data, but when you double-click it, it won't work until you rename it to testfile.txt (or it will work if the extension is compatible with the data in the file).
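For a binary file such as an exe, the string-based write in the question won't do; the raw bytes should be streamed instead. A minimal sketch of the buffered copy loop, shown here against plain java.io streams (the file names are hypothetical); with jcifs you would substitute an SmbFileOutputStream opened with your NtlmPasswordAuthentication as the destination stream:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class BinaryCopy {

    // Copy all bytes from in to out with a fixed-size buffer; this works for
    // any binary content, unlike writing String.getBytes() of text.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }

    public static void main(String[] args) throws IOException {
        // Create a small fake binary payload so the demo is self-contained.
        Files.write(Paths.get("tool.bin"), new byte[]{0x4D, 0x5A, 0x00, (byte) 0xFF});
        // With jcifs, "dest" would instead be
        // new SmbFileOutputStream(new SmbFile(NETWORK_FOLDER + fileName, auth)).
        try (InputStream src = new FileInputStream("tool.bin");
             OutputStream dest = new FileOutputStream("copy.bin")) {
            copy(src, dest);
        }
    }
}
```

The copyFiles method in the question would take a source File instead of a String and call this loop in place of sfos.write(fileContent.getBytes()).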
