Insert image into MySQL database - java

I've been trying for days to do this and have gotten absolutely nowhere. I know it can be done, but I've been trawling SO for answers and nothing has worked. I want to:
1. Upload a picture using my REST client.
2. Insert that uploaded picture into the MySQL database.
What I have tried:
The LOAD_FILE approach doesn't work. I'm using OS X and I don't know how to change ownership of the relevant folders, etc.; I never got an answer about this in my last post. How do I do this?
I've also tried doing it another way: http://examples.javacodegeeks.com/enterprise-java/rest/jersey/jersey-file-upload-example/
This does not work at all. I keep getting the error described in this post: Jersey REST WS Error: "Missing dependency for method... at parameter at index X", but the answer doesn't help me as I still don't know what it should be...
Can anyone please guide me through it?
I'm using a Jersey REST client in Java. Many of the tutorials for doing this mention a pom.xml file; I don't have one and don't know what it is.
Thank you,
Omar
EDIT:
This is the file upload:
package com.omar.rest.apimethods;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.sun.jersey.core.header.FormDataContentDisposition;
import com.sun.jersey.multipart.FormDataParam;
#Path("/files")
public class FileUpload {
private String uploadLocationFolder = "/Users/Omar/Pictures/";
#POST
#Path("/upload")
#Consumes(MediaType.MULTIPART_FORM_DATA)
public Response uploadFile(
#FormDataParam("file") InputStream fileInputStream,
#FormDataParam("file") FormDataContentDisposition contentDispositionHeader) {
String filePath = "/Users/Omar/Pictures/" + contentDispositionHeader.getFileName();
// save the file to the server
saveFile(fileInputStream, filePath);
String output = "File saved to server location : " + filePath;
return Response.status(200).entity(output).build();
}
// save uploaded file to a defined location on the server
private void saveFile(InputStream uploadedInputStream,
String serverLocation) {
try {
OutputStream outpuStream = new FileOutputStream(new File(serverLocation));
int read = 0;
byte[] bytes = new byte[1024];
outpuStream = new FileOutputStream(new File(serverLocation));
while ((read = uploadedInputStream.read(bytes)) != -1) {
outpuStream.write(bytes, 0, read);
}
outpuStream.flush();
outpuStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
Schema for the table (one I created for testing):
image_id: int auto-incrementing PK, picture: BLOB.
I could make it a file link and just load the image on my website but I can't even get that far yet.

I would recommend storing your image in some kind of cheap, well-permissioned flat storage (such as network storage) and keeping only the path to that location in the database. If you store the image as a BLOB, the database does something similar internally anyway, but there is overhead in making the database manage storing and retrieving the images. They will also eat through a lot of your database's disk space, and if you need more room for images, adding space to flat storage is easier than adding space to a database.
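That said, if you do want to insert the uploaded bytes into the picture BLOB column of the test table from the question, a minimal JDBC sketch could look like the one below. The column names follow the schema above, but the table name images, the JDBC URL, and the credentials are placeholders; you would call this from uploadFile() with the @FormDataParam stream:

import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ImageDao {

    // Streams the uploaded bytes into the BLOB column; image_id is auto-incremented.
    public static void saveImage(InputStream imageStream) throws SQLException {
        String url = "jdbc:mysql://localhost:3306/mydb"; // placeholder connection details
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO images (picture) VALUES (?)")) {
            // setBinaryStream sends the stream to MySQL without buffering it all in memory
            ps.setBinaryStream(1, imageStream);
            ps.executeUpdate();
        }
    }
}

If you follow the flat-storage advice instead, the column becomes a plain VARCHAR and you insert the filePath string that uploadFile() already builds.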

Related

how to batch insert data into Google BigQuery from a Java service?

I've read through a few similar questions on SO and GCP docs - but did not get a definitive answer...
Is there a way to batch insert data from my Java service into BigQuery directly, without using intermediary files, PubSub, or other Google services?
The key here is the "batch" mode: I do not want to use streaming API as it costs a lot.
I know there are other ways to do batch inserts using Dataflow, Google Cloud Storage, etc. - I am not interested in those, I need to do batch inserts programmatically for my use case.
I was hoping to use the REST batch API but it looks like it is deprecated now: https://cloud.google.com/bigquery/batch
Alternatives that are pointed to by the docs are:
https://cloud.google.com/bigquery/docs/reference/rest/v2/tabledata/insertAll REST request - but it looks like it works in streaming mode, inserting one row at a time (and costing a lot)
a Java client library: https://developers.google.com/api-client-library/java/google-api-java-client/dev-guide
After following through the links and references I ended up finding this specific API method promising: https://googleapis.dev/java/google-api-client/latest/index.html?com/google/api/client/googleapis/batch/BatchRequest.html
with the following usage pattern:
Create a BatchRequest object from this Google API client instance.
Sample usage:
client.batch(httpRequestInitializer)
.queue(...)
.queue(...)
.execute();
Is this API using batch mode rather than streaming, and is it the right way to go?
thank you!
The "batch" version of writing data is called a "load job" in the Java client library. The bigquery.writer method creates an object which can be used to write data bytes as a batch load job. Set the format options based on the type of file you'd like to serialize to.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobStatistics.LoadStatistics;
import com.google.cloud.bigquery.TableDataWriteChannel;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.WriteChannelConfiguration;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.channels.Channels;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.UUID;
public class LoadLocalFile {

    public static void main(String[] args) throws IOException, InterruptedException {
        String datasetName = "MY_DATASET_NAME";
        String tableName = "MY_TABLE_NAME";
        Path csvPath = FileSystems.getDefault().getPath(".", "my-data.csv");
        loadLocalFile(datasetName, tableName, csvPath, FormatOptions.csv());
    }

    public static void loadLocalFile(
            String datasetName, String tableName, Path csvPath, FormatOptions formatOptions)
            throws IOException, InterruptedException {
        try {
            // Initialize client that will be used to send requests. This client only needs to be created
            // once, and can be reused for multiple requests.
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
            TableId tableId = TableId.of(datasetName, tableName);

            WriteChannelConfiguration writeChannelConfiguration =
                    WriteChannelConfiguration.newBuilder(tableId).setFormatOptions(formatOptions).build();

            // The location and JobName must be specified; other fields can be auto-detected.
            String jobName = "jobId_" + UUID.randomUUID().toString();
            JobId jobId = JobId.newBuilder().setLocation("us").setJob(jobName).build();

            // Imports a local file into a table.
            try (TableDataWriteChannel writer = bigquery.writer(jobId, writeChannelConfiguration);
                    OutputStream stream = Channels.newOutputStream(writer)) {
                // This example writes CSV data from a local file,
                // but bytes can also be written in batch from memory.
                // In addition to CSV, other formats such as
                // Newline-Delimited JSON (https://jsonlines.org/) are
                // supported.
                Files.copy(csvPath, stream);
            }

            // Get the Job created by the TableDataWriteChannel and wait for it to complete.
            Job job = bigquery.getJob(jobId);
            Job completedJob = job.waitFor();
            if (completedJob == null) {
                System.out.println("Job not executed since it no longer exists.");
                return;
            } else if (completedJob.getStatus().getError() != null) {
                System.out.println(
                        "BigQuery was unable to load local file to the table due to an error: \n"
                                + completedJob.getStatus().getError());
                return;
            }

            // Get output status
            LoadStatistics stats = completedJob.getStatistics();
            System.out.printf("Successfully loaded %d rows. \n", stats.getOutputRows());
        } catch (BigQueryException e) {
            System.out.println("Local file not loaded. \n" + e.toString());
        }
    }
}
Resources:
https://cloud.google.com/bigquery/docs/batch-loading-data#loading_data_from_local_files
https://cloud.google.com/bigquery/docs/samples/bigquery-load-from-file
system test which writes JSON from memory
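For completeness, here is a minimal, untested sketch of the "write from memory" variant mentioned in the comments above: newline-delimited JSON rows are built in memory and pushed through the same TableDataWriteChannel as a single batch load job. The class name, dataset, table, and row contents are placeholders:

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.TableDataWriteChannel;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.WriteChannelConfiguration;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class LoadFromMemory {

    public static void main(String[] args) throws IOException, InterruptedException {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        TableId tableId = TableId.of("MY_DATASET_NAME", "MY_TABLE_NAME");

        // One JSON object per line (NDJSON); the column names here are made up.
        String rows = "{\"name\":\"alice\",\"age\":30}\n"
                    + "{\"name\":\"bob\",\"age\":25}\n";

        WriteChannelConfiguration config = WriteChannelConfiguration.newBuilder(tableId)
                .setFormatOptions(FormatOptions.json())
                .build();
        JobId jobId = JobId.newBuilder().setLocation("us")
                .setJob("jobId_" + UUID.randomUUID()).build();

        // Writing to the channel creates a single load job, not streaming inserts.
        try (TableDataWriteChannel writer = bigquery.writer(jobId, config)) {
            writer.write(ByteBuffer.wrap(rows.getBytes(StandardCharsets.UTF_8)));
        }

        Job job = bigquery.getJob(jobId);
        Job completedJob = job.waitFor();
        if (completedJob != null && completedJob.getStatus().getError() == null) {
            System.out.println("Rows loaded with a single batch load job.");
        } else {
            System.out.println("Load failed: "
                    + (completedJob == null ? "job no longer exists" : completedJob.getStatus().getError()));
        }
    }
}

As with the file-based sample, this runs as a load job rather than a streaming insert, which is the cheaper batch path the question asks about.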

Convert large file to base64 representation in Java; OutOfMemory Exception

I have a situation in which I need to transmit an object from back-end to front-end in this format:
{
    filename: "filename",
    type: "type",
    src: "src",
    bytes: "base64Representation"
}
The bytes property consists of the base64 representation of a file stored in a repository on the remote server. Up until now I've worked with small files in the range of 1-2 MB, and the code for converting a file to its base64 representation has worked correctly. But now I'm facing problems with big files, larger than 100 MB. I've checked solutions that convert the file chunk by chunk, but at the end of the process I still need all the chunks concatenated into a single string, and at that step I get an OutOfMemory exception. I've also seen suggestions to use OutputStreams, but I can't apply them because I need the data in the above format. Does anyone have any suggestions on how I can get around this?
You can process the file on the fly in a servlet by wrapping response.getOutputStream() with a Base64 encoder; only the small copy buffer is ever held in memory, regardless of file size. Below is a working Spring Boot example that I have tested.
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
import javax.servlet.http.HttpServletResponse;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Base64;
@RestController
public class Base64Controller {

    @RequestMapping(value = "/base64", method = RequestMethod.GET)
    public void getBase64File(HttpServletResponse response) throws IOException {
        response.setContentType("text/plain");
        OutputStream wrap = Base64.getEncoder().wrap(response.getOutputStream());
        FileInputStream fis = new FileInputStream("./temp.txt");
        int bytes;
        byte[] buffer = new byte[2048];
        while ((bytes = fis.read(buffer)) != -1) {
            wrap.write(buffer, 0, bytes);
        }
        fis.close();
        wrap.close();
    }
}
A JSON response is a kludge here: Base64 carries only 6 bits of payload per byte, so you transfer roughly 33% more data than needed. Building a JSON DOM object of that size also overstretches both the server and the client.
So convert it to a simple binary download and stream it out, possibly throttled for large data.
This means a change in the API.
I have never worked with Struts, so I'm not sure whether this will work, but it should be something like this:
public class DownloadB64Action extends Action {

    private static final int BUFFER_SIZE = 1024;

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        response.setContentType("text/plain");
        try {
            FileInputStream in = new FileInputStream(new File("myfile.b64"));
            OutputStream out = Base64.getEncoder().wrap(response.getOutputStream());
            byte[] buffer = new byte[BUFFER_SIZE];
            int read;
            // write only the number of bytes actually read, not the whole buffer
            while ((read = in.read(buffer, 0, BUFFER_SIZE)) != -1) {
                out.write(buffer, 0, read);
            }
            in.close();
            out.flush();
            out.close();
        } catch (Exception e) {
            // TODO handle exception
        }
        return null;
    }
}
To produce the JSON structure you need, you might try writing "{\"filename\":\"filename\",\"type\":\"type\",\"src\":\"src\",\"bytes\":\"".getBytes() directly to response.getOutputStream() before the Base64 payload and "\"}".getBytes() after it; a sketch of that idea follows below.
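A rough sketch of that wrapping idea, assuming javax.servlet and Java 8's java.util.Base64 (the helper class, method, and parameter names are made up for illustration). The one subtlety is that the Base64 wrapper must be closed to emit its final padding characters, but closing it would also close the response stream, so a small shield stream is used:

import java.io.FileInputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.servlet.http.HttpServletResponse;

public class JsonBase64Writer {

    // Streams {"filename":...,"bytes":"<base64>"} without ever holding the full payload in memory.
    public static void write(HttpServletResponse response, String fileName, String type,
                             String src, String filePath) throws IOException {
        response.setContentType("application/json");
        OutputStream raw = response.getOutputStream();
        raw.write(("{\"filename\":\"" + fileName + "\",\"type\":\"" + type
                + "\",\"src\":\"" + src + "\",\"bytes\":\"").getBytes(StandardCharsets.UTF_8));

        // Shield the servlet stream so closing the Base64 wrapper (required to flush
        // its final padding) does not close the response stream itself.
        OutputStream shield = new FilterOutputStream(raw) {
            @Override public void write(byte[] b, int off, int len) throws IOException { out.write(b, off, len); }
            @Override public void close() throws IOException { flush(); }
        };
        try (FileInputStream in = new FileInputStream(filePath);
             OutputStream b64 = Base64.getEncoder().wrap(shield)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                b64.write(buffer, 0, read);
            }
        }
        raw.write("\"}".getBytes(StandardCharsets.UTF_8));
        raw.flush();
    }
}

Only the 8 KB copy buffer is held in memory at any point, so the file size no longer matters; the field values in the JSON envelope would of course need escaping if they can contain quotes.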

Generate CSV from Java object and move to Azure Storage without intermediate location

Is it possible to create a file such as a CSV from a Java object and move it to Azure Storage without using a temporary location?
According to your description, it seems that you want to upload a CSV file without taking up local disk space, so I suggest you upload it to Azure File Storage as a stream.
Please refer to the sample code below:
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.file.CloudFile;
import com.microsoft.azure.storage.file.CloudFileClient;
import com.microsoft.azure.storage.file.CloudFileDirectory;
import com.microsoft.azure.storage.file.CloudFileShare;
import com.microsoft.azure.storage.StorageCredentials;
import com.microsoft.azure.storage.StorageCredentialsAccountAndKey;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class UploadCSV {

    // Configure the connection string with your values
    public static final String storageConnectionString =
            "DefaultEndpointsProtocol=http;" +
            "AccountName=<storage account name>;" +
            "AccountKey=<storage key>";

    public static void main(String[] args) {
        try {
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

            // Create the Azure Files client.
            CloudFileClient fileClient = storageAccount.createCloudFileClient();
            StorageCredentials sc = fileClient.getCredentials();

            // Get a reference to the file share
            CloudFileShare share = fileClient.getShareReference("test");

            // Get a reference to the root directory for the share.
            CloudFileDirectory rootDir = share.getRootDirectoryReference();

            // Get a reference to the file you want to upload
            CloudFile file = rootDir.getFileReference("test.csv");

            // Upload content directly from memory (StringBufferInputStream is deprecated)
            byte[] content = "aaa".getBytes(StandardCharsets.UTF_8);
            file.upload(new ByteArrayInputStream(content), content.length);
            System.out.println("upload success");
        } catch (Exception e) {
            // Output the stack trace.
            e.printStackTrace();
        }
    }
}
With this I uploaded the file to the storage account successfully.
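To tie this back to the question (a CSV generated from a Java object without touching disk), you can build the CSV text in memory and hand it to upload() as a stream. The helper and row layout below are made up for illustration; the CloudFile reference would come from the same share/rootDir lookup as above:

import com.microsoft.azure.storage.file.CloudFile;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class CsvUploader {

    // Builds CSV content in memory from row data and streams it straight to Azure File Storage.
    public static void uploadCsv(CloudFile file, List<String[]> rows) throws Exception {
        StringBuilder csv = new StringBuilder("id,name\n"); // example header
        for (String[] row : rows) {
            csv.append(String.join(",", row)).append('\n');
        }
        byte[] bytes = csv.toString().getBytes(StandardCharsets.UTF_8);
        // upload(InputStream, length) writes the bytes directly; no temporary file is created.
        file.upload(new ByteArrayInputStream(bytes), bytes.length);
    }
}

For larger payloads, the stream-based blob approaches in the threads referenced below avoid buffering the whole file in memory.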
You could also refer to the threads:
1.Can I upload a stream to Azure blob storage without specifying its length upfront?
2.Upload blob in Azure using BlobOutputStream
Hope it helps you.

OpenShift Java - Use image out of Data dir

I'm trying to create an upload-image button and afterwards show the image on a different JSP page.
I want to do this by uploading into the app-root/data/images folder. This works with the below filepath: filePath = System.getenv("OPENSHIFT_DATA_DIR") + "images/";
But how can I show this image on my jsp? I tried using:
<BODY>
    <h1>SNOOP PAGE</h1>
    Go back
    <% String filepath = System.getenv("OPENSHIFT_DATA_DIR") + "images/";
       out.println("<img src='" + filepath + "logo21.jpg'/>");
    %>
    <img src="app-root/data/images/logo21.jpg"/>
</BODY>
Neither of these options works. I also read that I need to create a symbolic link, but when I'm in app-root/data, app-root/data/images, or app-root, the command ln -s returns "missing file operand".
The logo21.jpg file does show up in my Git Bash.
@developercorey is right (gave you +1 👍); I just feel the need to explain why:
Your uploaded image ends up in a folder on your server
(String filepath = System.getenv("OPENSHIFT_DATA_DIR") + "images/" is the folder path on the server).
Your rendered HTML "<img src='"+filepath+"logo21.jpg'/>" gets sent to the client (the user's browser) with the server's file path as the URL.
Obviously, when the user's browser tries to locate the image using the server's path, which doesn't exist on the local machine, it won't work.
The best solution, as @developercorey suggested, is to add a new servlet or a filter to serve photos from the OPENSHIFT_DATA_DIR folder:
You'll have a new url mapped to the servlet serving your photo, something like http://your-server/uploaded/
And you can use <img src="http://your-server/uploaded/logo21.jpg" /> in your jsp.
Here's the snippet from How-To: Upload and Serve files using Java Servlets on OpenShift
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import javax.activation.MimetypesFileTypeMap;
import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

@WebServlet(name = "uploads", urlPatterns = {"/uploads/*"})
@MultipartConfig
public class Uploads extends HttpServlet {

    private static final long serialVersionUID = 2857847752169838915L;
    int BUFFER_LENGTH = 4096;

    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        PrintWriter out = response.getWriter();
        for (Part part : request.getParts()) {
            InputStream is = request.getPart(part.getName()).getInputStream();
            String fileName = getFileName(part);
            FileOutputStream os = new FileOutputStream(System.getenv("OPENSHIFT_DATA_DIR") + fileName);
            byte[] bytes = new byte[BUFFER_LENGTH];
            int read = 0;
            while ((read = is.read(bytes, 0, BUFFER_LENGTH)) != -1) {
                os.write(bytes, 0, read);
            }
            os.flush();
            is.close();
            os.close();
            out.println(fileName + " was uploaded to " + System.getenv("OPENSHIFT_DATA_DIR"));
        }
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        String filePath = request.getRequestURI();
        File file = new File(System.getenv("OPENSHIFT_DATA_DIR") + filePath.replace("/uploads/", ""));
        InputStream input = new FileInputStream(file);
        response.setContentLength((int) file.length());
        response.setContentType(new MimetypesFileTypeMap().getContentType(file));
        OutputStream output = response.getOutputStream();
        byte[] bytes = new byte[BUFFER_LENGTH];
        int read = 0;
        while ((read = input.read(bytes, 0, BUFFER_LENGTH)) != -1) {
            output.write(bytes, 0, read);
            output.flush();
        }
        input.close();
        output.close();
    }

    private String getFileName(Part part) {
        for (String cd : part.getHeader("content-disposition").split(";")) {
            if (cd.trim().startsWith("filename")) {
                return cd.substring(cd.indexOf('=') + 1).trim().replace("\"", "");
            }
        }
        return null;
    }
}
The best way to serve user uploaded images that you are storing in your OPENSHIFT_DATA_DIR would be to use a servlet as described here: https://forums.openshift.com/how-to-upload-and-serve-files-using-java-servlets-on-openshift?noredirect
This servlet basically takes the path/name of the image that is being requested, reads it from the filesystem and then serves it to the requester.
The OPENSHIFT_DATA_DIR directory is not web-accessible. You can make images stored in the OPENSHIFT_DATA_DIR (aka app-root/data) directory web-accessible by creating a symlink to them from the publicly accessible OPENSHIFT_REPO_DIR.
For one-time use, as a proof of concept:
rhc ssh -a <your_app_name> -n <your_namespace>
ln -sf ${OPENSHIFT_DATA_DIR}images ${OPENSHIFT_REPO_DIR}images
You should now be able to access logo21.jpg at https://<your_app_name>-<your_namespace>.rhcloud.com/images/logo21.jpg, or <img src="/images/logo21.jpg"/>.
The contents of the OPENSHIFT_REPO_DIR are overwritten when you push changes, so you'll want to create the symlink with a deploy hook to re-create it each time you deploy. In .openshift/action_hooks/deploy:
#!/bin/bash
# This deploy hook gets executed after dependencies are resolved and the
# build hook has been run but before the application has been started back
# up again.
# create the images directory if it doesn't exist
if [ ! -d ${OPENSHIFT_DATA_DIR}images ]; then
mkdir ${OPENSHIFT_DATA_DIR}images
fi
# create symlink to uploads directory
ln -sf ${OPENSHIFT_DATA_DIR}images ${OPENSHIFT_REPO_DIR}images
You can upload the file to the data directory, then copy it from the data directory to any folder in the home directory.
After that you should be able to reference the image as usual in your page, but it appears OpenShift only serves items from a previous deployment or git push, so it may be best to save the file in a database and read it from there directly.

Base64 String corrupt from Java

I have a phonegap plugin I altered. The Java part outputs a base64 string:
package org.apache.cordova;

import java.io.ByteArrayOutputStream;
import java.io.File;
import org.apache.cordova.api.Plugin;
import org.apache.cordova.api.PluginResult;
import org.json.JSONArray;
import android.annotation.TargetApi;
import android.graphics.Bitmap;
import android.os.Environment;
import android.util.Base64;
import android.view.View;

public class Screenshot extends Plugin {

    @Override
    public PluginResult execute(String action, JSONArray args, String callbackId) {
        // starting on ICS, some WebView methods
        // can only be called on UI threads
        final Plugin that = this;
        final String id = callbackId;
        super.cordova.getActivity().runOnUiThread(new Runnable() {
            //@Override
            @TargetApi(8)
            public void run() {
                View view = webView.getRootView();
                view.setDrawingCacheEnabled(true);
                Bitmap bitmap = Bitmap.createBitmap(view.getDrawingCache());
                view.setDrawingCacheEnabled(false);
                File folder = new File(Environment.getExternalStorageDirectory(), "Pictures");
                if (!folder.exists()) {
                    folder.mkdirs();
                }
                File f = new File(folder, "screenshot_" + System.currentTimeMillis() + ".png");
                System.out.println(folder);
                System.out.println("screenshot_" + System.currentTimeMillis() + ".png");
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, baos);
                byte[] b = baos.toByteArray();
                String base64String = Base64.encodeToString(b, Base64.DEFAULT);
                String mytextstring = "data:image/png;base64," + base64String;
                System.out.println(mytextstring);
                that.success(new PluginResult(PluginResult.Status.OK, mytextstring), id);
            }
        });
        PluginResult imageData = new PluginResult(PluginResult.Status.NO_RESULT);
        imageData.setKeepCallback(true);
        System.out.println("imageData=============>>>>>" + imageData);
        return imageData;
    }
}
I then pass this to some Javascript and then send the string to a server. I have checked the string that the .php file receives, and the base64 string is identical. However when I decode the base64 string it seems corrupt. For a better example copy the contents of this text file into a decoder.
http://dl.dropbox.com/u/91982671/base64.txt
Note: when the .php file tries to decode it, data:image/png;base64, is in front; I have removed it here to make it easier to paste into a decoder.
Decoder found here:
http://www.motobit.com/util/base64-decoder-encoder.asp
All I can think is that for some reason I may not be outputting the base64 string correctly from the Java. Does anyone have any idea what's going on, or what may cause this?
I played about with this for a good few hours last night and took some of these suggestions into consideration.
Firstly I checked the image before I encoded it. It was fine.
However, decoding it before it goes to the JavaScript showed that it was corrupted, which meant it had to be something to do with the Java encoding process. To solve this (and I don't claim to 100% understand why it happens), the problem seems to lie with this code:
String mytextstring = "data:image/png;base64,"+base64String;
and the way I was adding "data:image/png;base64," before I sent it to the JavaScript and on to the PHP decoder. To resolve this I removed it from the Java code so it became:
String mytextstring = base64String;
I then added the prefix in the JavaScript function that sends the string to the server; this works and I received an uncorrupted image. Just in case anyone wonders/cares, the JavaScript function where I add it instead is below:
function returnScreenshotImage(imageData) {
    base64string = "data:image/png;base64," + imageData;
    console.log("String: " + base64string);
    var url = 'http://www.websitename.co.uk/upload.php';
    var params = {image: imageData};
    document.basicfrm.oldscreenshotimg.value = document.basicfrm.screenshotimg.value;
    // send the data
    $.post(url, params, function(data) {
        document.basicfrm.screenshotimg.value = data;
    });
}
As you can see the line:
base64string = "data:image/png;base64,"+imageData;
adds the prefix previously added by the Java. This works now. Hope this helps people in the future. If anyone cares to comment and explain why this is, feel free. :)
