How can I upload an image, save it into the database, and then show that image on the user's profile page? The image can be of any type: jpg, jpeg or png. I am using JSP, jQuery, the Spring MVC framework, Java and Spring Data JPA.
I am not using servlets directly in my application. I am new to this field and have not been able to complete it.
You need a VARBINARY column to hold the image. Open the file using an InputStream, read its bytes into a byte[], and write that into the column.
You need another column to store the MIME type. You can obtain the file type with java.nio.file.Files.probeContentType(Path path).
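A minimal sketch of the saving side, assuming a JPA entity (here called ProfileImage) with a byte[] column and a String contentType column, plus a Spring Data repository named ProfileImageRepository (these names are illustrative, not from the question):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ImageSaver {

    private final ProfileImageRepository repository; // assumed Spring Data JPA repository

    public ImageSaver(ProfileImageRepository repository) {
        this.repository = repository;
    }

    public void saveImage(String fileLocation) throws IOException {
        Path path = Paths.get(fileLocation);
        ProfileImage image = new ProfileImage();

        // Open the file with an InputStream and read its bytes for the VARBINARY column
        try (InputStream in = Files.newInputStream(path)) {
            image.setData(in.readAllBytes());
        }

        // Store the MIME type (e.g. "image/png") in its own column
        image.setContentType(Files.probeContentType(path));
        repository.save(image);
    }
}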
In your response headers, you need to:
Use setContentLength() to set the length of the file.
Use setContentType() to set the MIME type of the image.
If your database offers a means to create an InputStream on a varbinary column, use it. Otherwise, you need to read the contents of your varbinary column into a byte[], and then create a ByteArrayInputStream on the byte[].
Finally, you need to construct a ResponseEntity with the stream as the body; in Spring it is safest to wrap it in an InputStreamResource so a message converter can write it out: return new ResponseEntity<>(new InputStreamResource(inputStream), httpHeaders, HttpStatus.OK);
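Putting the serving side together, a minimal Spring MVC sketch that reuses the illustrative ProfileImage entity and ProfileImageRepository from above (the URL mapping and names are assumptions):
import java.io.ByteArrayInputStream;

import org.springframework.core.io.InputStreamResource;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ProfileImageController {

    private final ProfileImageRepository repository; // assumed Spring Data JPA repository

    public ProfileImageController(ProfileImageRepository repository) {
        this.repository = repository;
    }

    @GetMapping("/profile-image/{id}")
    public ResponseEntity<InputStreamResource> getImage(@PathVariable Long id) {
        ProfileImage image = repository.findById(id)
                .orElseThrow(() -> new IllegalArgumentException("no image for id " + id));

        byte[] data = image.getData(); // bytes read back from the VARBINARY column
        HttpHeaders headers = new HttpHeaders();
        headers.setContentLength(data.length);                                    // length of the file
        headers.setContentType(MediaType.parseMediaType(image.getContentType())); // stored MIME type

        return new ResponseEntity<>(new InputStreamResource(new ByteArrayInputStream(data)),
                headers, HttpStatus.OK);
    }
}
The JSP profile page then only needs an img tag pointing at that URL, for example <img src="profile-image/123"/> (the mapping shown is just an example).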
I am using the ng-file-upload AngularJS API to upload multiple files to the server, but that is the traditional way of doing it. My requirement is that I do not need to store the files on the server as-is. I have a REST endpoint that is responsible for storing user input data in the DB. Along with the REST request I pass the file array object together with the other form values. When the data reaches the REST endpoint it accesses each attribute and stores the data, but when it tries to read the file array object I cannot read the file content for each file.
Sample File Upload Code
jsfiddle
Note that I just want to pass only $scope.files along with the REST request. Please let me know how I can read the file content values on the server side by reading the file array in Java. If you know a better way to do this, please share your ideas.
REST Service Code Snippet
@POST
@Path("/manual")
@Produces(MediaType.APPLICATION_JSON)
public boolean insertResults(testVO testResult) {
    for (Object o : testResult.getFiles()) {
        LinkedHashMap<String, String> l = (LinkedHashMap<String, String>) o;
        System.out.println(l.get("result"));
    }
    return true;
}
Note: the getFiles() method of testVO returns an Object[] array.
In the preceding code I convert each object into a LinkedHashMap and access the necessary fields such as size, type, etc. But my requirement is: how can I get the content belonging to that file?
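For illustration only: if the client also sent the file content as a Base64-encoded string (say under a hypothetical "content" key, which ng-file-upload does not add by itself), the bytes could be recovered on the Java side like this:
import java.util.Base64;
import java.util.LinkedHashMap;

public class FileContentReader {

    // Assumes the map contains a Base64 "content" field; that field name is an
    // assumption for the sketch, not part of the original payload.
    public byte[] readContent(LinkedHashMap<String, String> fileAttributes) {
        String base64Content = fileAttributes.get("content");
        return Base64.getDecoder().decode(base64Content); // raw file bytes
    }
}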
I can use the following code to read a single JSON file, but I need to read multiple JSON files and merge them into one DataFrame. How can I do this?
DataFrame jsondf = sqlContext.read().json("/home/spark/articles/article.json");
Or is there a way to read multiple JSON files into a JavaRDD and then convert that to a DataFrame?
To read multiple inputs in Spark, use wildcards. That's true whether you're constructing a DataFrame or an RDD.
context.read().json("/home/spark/articles/*.json")
// or getting json out of s3
context.read().json("s3n://bucket/articles/201510*/*.json")
You can use exactly the same code to read multiple JSON files. Just pass a path to a directory or a path with wildcards instead of a path to a single file.
DataFrameReader also provides a json method with the following signature:
json(jsonRDD: JavaRDD[String])
which can be used to parse JSON already loaded into JavaRDD.
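For example, a minimal Java sketch (assuming a JavaSparkContext named jsc alongside the existing sqlContext; the path is illustrative, and textFile expects one JSON object per line, which is the format the json reader assumes by default):
// Load several files into a JavaRDD<String>, then parse the already-loaded RDD
JavaRDD<String> jsonLines = jsc.textFile("/home/spark/articles/*.json");
DataFrame articles = sqlContext.read().json(jsonLines);
articles.printSchema();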
The function spark.read.json accepts a list of files as a parameter.
spark.read.json(list_of_all_json_files)
This will read all the files in the list and return a single DataFrame containing all the information from those files.
Using PySpark, if you have all the JSON files in the same folder, you can use df = spark.read.json('folder_path'). This will load all the JSON files inside that folder.
For better reading performance, I recommend providing the schema to the DataFrame reader:
import pyspark.sql.types as T
billing_schema = T.StructType([
T.StructField('accountId', T.LongType(),True),
T.StructField('accountName',T.StringType(),True),
T.StructField('accountOwnerEmail',T.StringType(),True),
T.StructField('additionalInfo',T.StringType(),True),
T.StructField('chargesBilledSeparately',T.BooleanType(),True),
T.StructField('consumedQuantity',T.DoubleType(),True),
T.StructField('consumedService',T.StringType(),True),
T.StructField('consumedServiceId',T.LongType(),True),
T.StructField('cost',T.DoubleType(),True),
T.StructField('costCenter',T.StringType(),True),
T.StructField('date',T.StringType(),True),
T.StructField('departmentId',T.LongType(),True),
T.StructField('departmentName',T.StringType(),True),
T.StructField('instanceId',T.StringType(),True),
T.StructField('location',T.StringType(),True),
T.StructField('meterCategory',T.StringType(),True),
T.StructField('meterId',T.StringType(),True),
T.StructField('meterName',T.StringType(),True),
T.StructField('meterRegion',T.StringType(),True),
T.StructField('meterSubCategory',T.StringType(),True),
T.StructField('offerId',T.StringType(),True),
T.StructField('partNumber',T.StringType(),True),
T.StructField('product',T.StringType(),True),
T.StructField('productId',T.LongType(),True),
T.StructField('resourceGroup',T.StringType(),True),
T.StructField('resourceGuid',T.StringType(),True),
T.StructField('resourceLocation',T.StringType(),True),
T.StructField('resourceLocationId',T.LongType(),True),
T.StructField('resourceRate',T.DoubleType(),True),
T.StructField('serviceAdministratorId',T.StringType(),True),
T.StructField('serviceInfo1',T.StringType(),True),
T.StructField('serviceInfo2',T.StringType(),True),
T.StructField('serviceName',T.StringType(),True),
T.StructField('serviceTier',T.StringType(),True),
T.StructField('storeServiceIdentifier',T.StringType(),True),
T.StructField('subscriptionGuid',T.StringType(),True),
T.StructField('subscriptionId',T.LongType(),True),
T.StructField('subscriptionName',T.StringType(),True),
T.StructField('tags',T.StringType(),True),
T.StructField('unitOfMeasure',T.StringType(),True)
])
billing_df = spark.read.json('/mnt/billingsources/raw-files/202106/', schema=billing_schema)
Function json(String... paths) takes variable arguments. (documentation)
So you can change your code like this:
sqlContext.read().json(file1, file2, ...)
I have a Java program that gets the BLOB data from the database and then emails this file to a specific email address. My problem is that I have to use some framework functions (I can make DB calls only through these), and I think they do not handle BLOB datatypes... All I can get is the string representation of the result; this is the logged result of the code (a framework call):
String s = String.valueOf(result.get(j).getValue("BLOB_DATA"));
System.out.println(s);
Log result:
<binary data> 50 KB
So this is the data I have to convert SOMEHOW into a valid PDF file, but right now I'm stuck...
Is it even possible to convert it into a valid byte[]? I tried several ways, but all I get are invalid files... :(
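For reference, if the framework could hand back the raw bytes (for example a byte[] or a java.sql.Blob) instead of that String, writing them out as a PDF would be simple; a hypothetical sketch with an assumed accessor:
import java.io.FileOutputStream;
import java.io.IOException;

public class BlobToPdf {

    // blobBytes stands in for whatever raw bytes the framework might return;
    // obtaining them is exactly the open question above.
    public void writePdf(byte[] blobBytes, String targetPath) throws IOException {
        try (FileOutputStream out = new FileOutputStream(targetPath)) {
            out.write(blobBytes); // the bytes are already the PDF content, written as-is
        }
    }
}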
I have never saved and retrieved an image to and from the database before. I wrote down what I guessed would be the process. I would just like to know if this is correct though:
Save image:
Select & Upload image file from jsp (Struts 2) which will save it as a .tmp file.
Convert the .tmp file to a byte[] array (Java Server-Side)
Store the byte[] array as a blob in the database (Java Server-Side)
Get image:
Get the byte[] array from the database (Java Server-Side)
Convert the byte[] array to an image file (Java Server-Side)
Create the file in a location (Java Server-Side)
Use an img tag to display the file (JSP Client-Side)
Delete the file after it's finished being used? (Java Server-Side)
I'm aware of the fact that it is highly recommended to not save & retrieve images to and from the database. I would like to know how to do it anyway.
Thanks
Almost correct.
It's expensive and not so great to create the file on the fly and then delete it.
Yes, you store it as the raw bytes in the database, but the way to retrieve it and display it to a client machine is to implement a web handler that sets the content-type of the response to the appropriate MIME type and then dumps the bytes out to the response stream.
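A minimal sketch of such a handler as a plain servlet; the two load methods stand in for your own data-access code and are assumptions, not an existing API:
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/images/*")
public class ImageServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        long imageId = Long.parseLong(request.getPathInfo().substring(1)); // e.g. /images/42

        byte[] data = loadImageBytes(imageId);        // assumed lookup of the blob bytes
        String mimeType = loadImageMimeType(imageId); // assumed lookup of the stored type

        response.setContentType(mimeType);      // appropriate MIME type
        response.setContentLength(data.length);
        response.getOutputStream().write(data); // dump the bytes to the response stream
    }

    // Placeholders for your actual data-access layer:
    private byte[] loadImageBytes(long id) { throw new UnsupportedOperationException(); }

    private String loadImageMimeType(long id) { throw new UnsupportedOperationException(); }
}
The JSP then just uses an img tag such as <img src="images/42"/> (the URL pattern is only an example), so no temporary file ever has to be created or deleted.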
Yes, you've got it right.
Save Image :
The decision of how to save the image depends very much on how it will be used later. One option is to save the file on the file system; the location of the saved file should then be stored as metadata in the database table.
Get Image:
You do not have to write the file data to any temporary location; it can easily be rendered from the database alone. Just send a request from the client and handle that request in a specially designed servlet. This servlet reads the file metadata and the corresponding file and, if successful, writes the file back to the response stream.
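A minimal sketch of that servlet for the file-system option: the stored path and MIME type come from the metadata table, and lookupMetadata() is an assumed placeholder for your own query:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/files/*")
public class FileServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        long fileId = Long.parseLong(request.getPathInfo().substring(1));

        FileMetadata meta = lookupMetadata(fileId); // assumed metadata query
        Path path = Paths.get(meta.location);       // file location stored in the DB

        response.setContentType(meta.contentType);
        response.setContentLengthLong(Files.size(path));
        Files.copy(path, response.getOutputStream()); // stream the file from disk
    }

    // Placeholder type and lookup standing in for your data-access layer:
    static class FileMetadata {
        String location;
        String contentType;
    }

    private FileMetadata lookupMetadata(long id) { throw new UnsupportedOperationException(); }
}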