I have a Java application running on Apache Tomcat on two different servers, A and B. The application involves uploading and downloading files, mostly PDFs and images. Currently I host all the files on an FTP server, F. Now I am having the following problems:
Uploading and downloading files is unreliable, because creating the FTP connection sometimes succeeds and sometimes throws a timeout error.
I display images by converting them to Base64, which runs into the same connection trouble described above.
Solutions that I can think of:
Use the application server itself to host the files (is that good practice?). Also, since two different servers run the application, it would be hard to keep them in sync.
I have heard of shared file hosting, but I worry that it would cause security problems.
Any solution to the above problems would be really appreciated. Thanks.
If your application uses a database, you could store these files as LOBs (character or binary large objects) in the database instead of on disk.
If the files are small, you can store them as CLOBs or BLOBs in the database and serve them over HTTP (REST endpoints in your application server).
If the files are large, store them on a NAS or other shared storage. Don't convert them to Base64; serve them as binary attachments over HTTP (again via REST endpoints in your application server). You can optionally store the file locations in the database to keep track of them.
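As a rough illustration, a minimal servlet sketch of serving a stored file as a binary attachment without any Base64 round-trip (the /srv/files directory and the name request parameter are assumptions of the example; in practice the location might come from a database lookup):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class FileDownloadServlet extends HttpServlet {
    // Assumed base directory on shared storage (e.g. a NAS mount)
    private static final Path BASE = Paths.get("/srv/files");

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String name = req.getParameter("name"); // hypothetical parameter
        if (name == null) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST);
            return;
        }
        // Normalize and confine the path to BASE to avoid directory traversal
        Path file = BASE.resolve(name).normalize();
        if (!file.startsWith(BASE) || !Files.exists(file)) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        String mime = getServletContext().getMimeType(file.toString());
        resp.setContentType(mime != null ? mime : "application/octet-stream");
        resp.setHeader("Content-Length", String.valueOf(Files.size(file)));
        resp.setHeader("Content-Disposition", "attachment; filename=\"" + file.getFileName() + "\"");
        // Stream the raw bytes straight to the client
        Files.copy(file, resp.getOutputStream());
    }
}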
Related
I'm currently developing a tracking system with JavaFX and a MySQL database that is kept on a server. My application is used within a network and allows users to upload and download pictures and several types of documents.
My question: what is the best way to send files to and retrieve them from a server within a network? And should I store the files themselves in MySQL, or just their paths? If only the paths, do I need FTP or some other technique? I need a detailed answer because it is my first time developing such an application.
Edit: I want to store the data on a server. I am building this application for client machines so that clients can keep documents on the server and access their files from any machine... I have no idea how to transfer files from client machines to the server. Please help me!
You could use socket programming, as in the following example:
http://www.rgagnon.com/javadetails/java-0542.html
But if you are using Java 7 or later, the Files class is the better choice instead of hand-rolled BufferedInputStream or FileInputStream copying (no extra library required):
/* You can also get a Path from a File: file.toPath() */
Files.copy(InputStream in, Path target)   // write an incoming stream to a file
Files.copy(Path source, OutputStream out) // stream a stored file out to a client
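For instance, a minimal sketch of both directions (the /srv/uploads path and file name are assumptions of the example):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class FilesCopyExample {
    public static void main(String[] args) throws IOException {
        Path target = Paths.get("/srv/uploads/report.pdf"); // assumed destination
        // Save an incoming stream (e.g. a socket or HTTP upload) to disk
        Files.copy(System.in, target, StandardCopyOption.REPLACE_EXISTING);
        // Later, stream the stored file back out (here simply to stdout)
        Files.copy(target, System.out);
    }
}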
I have developed a website; one of its operations is to read and write data in text files stored on my local machine, such as D://test.txt or C://file.txt. Now I am going to host the website on an external server (i.e. on the internet), and I wonder where to keep the files involved in these read and write operations. At present I get a FileNotFoundException when I use my local machine's paths. For your information, I am using the GlassFish server.
You will want to create a system property on GlassFish that holds the file path and name, then upload the file to that location on the server where your web application is deployed.
Depending on your needs, you may find it easier to deploy the file with your application. Make sure the file is on the classpath, and you can load it in any number of ways.
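For example (the property name files.dir and its value are assumptions of the sketch), you could define the property with asadmin create-system-properties files.dir=/opt/myapp/files and then resolve file locations against it in code:

import java.io.File;

public class FileLocation {
    public static void main(String[] args) {
        // Read the assumed "files.dir" system property configured on the server,
        // falling back to the temp directory for local runs
        String dir = System.getProperty("files.dir", System.getProperty("java.io.tmpdir"));
        File data = new File(dir, "test.txt");
        System.out.println("Reading and writing files under: " + data.getAbsolutePath());
    }
}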
We have a web application (Spring, Hibernate, and MySQL as the database) in which multiple users can store large videos (pre-recorded or recorded from the application itself) on the server at the same time.
In that scenario the server load will certainly be high. We expect 500-2000 users of the application.
So what strategy should I use to reduce the load on the server and make the response time faster?
1) Storing the videos on our own server (with large disk space), and using ActiveMQ/RabbitMQ to queue file uploads and downloads.
2) Storing the videos on a third-party service (like YouTube, Vimeo, etc.) that uploads all the videos to one central account. I recently checked this with YouTube and Vimeo, but they require the end user's login credentials for each upload, and I don't want end users of my application to have to provide their credentials before each upload.
If there is any other way to reduce the workload and improve the response time for simultaneous uploads to the server, please advise.
Thanks in advance,
Arun
Multiple servers can help.
On a single server:
If you use a single-core processor, only ONE client is served at a time.
If you use a multi-core processor and open a new thread per connection, only about as many clients as you have cores are served at once, and even that is optimistic, because local memory may run out before the OS can flush the data to the local hard disk (which has a single bus). So serving 500-2000 clients pushes you toward a multi-server solution.
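For illustration, a minimal sketch of the thread-per-connection upload server being described (port 9000, the pool size, and the temp-file destination are assumptions; the point is that each connection ties up a worker thread until its upload reaches the single local disk):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class UploadServer {
    public static void main(String[] args) throws IOException {
        // One worker per core: roughly the concurrency ceiling described above
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        try (ServerSocket server = new ServerSocket(9000)) { // assumed port
            while (true) {
                Socket client = server.accept();
                pool.submit(() -> {
                    // The worker is occupied until the whole upload is on disk
                    try (Socket c = client;
                         InputStream in = c.getInputStream();
                         OutputStream out = new FileOutputStream(
                                 File.createTempFile("upload", ".bin"))) {
                        byte[] buf = new byte[8192];
                        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                });
            }
        }
    }
}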
In my webapp users can download files from one another. If user A has shared a file F, then user B, after connecting to A, can download file F from A. So far each user makes a simple HTTP connection to another user, like xxx.xxx.xxx.xxx/FileList. The file resides on each user's local hard disk. So that a user can download a file, I had two options in mind:
As the user shares a file, copy it into the web-app directory on the server, so that the download link becomes as simple as "Click to download".
Run a separate FTP server on each node.
I don't know which of these is the better option, but the first one seems very simple to me. What are the ways each client can share files without having to copy anything into the webapp directory? How could I use a P2P protocol in this case?
NOTE: I am using Tomcat 7.
Real P2P is impossible without opening a listening socket on the client machine (which means you have to install something on the client machine).
If you don't want to STORE the files on the server, I would instead recommend a "connection server" that acts as a gateway between the two users. User A uploads while user B downloads at the same time; all you need is a byte buffer in memory, and the relayed bytes can be dropped afterwards.
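A sketch of that gateway idea using two servlets that share an in-memory pipe (the servlet paths, the transfer request parameter, and the use of piped streams are all assumptions of the example; the rendezvous between the two requests, timeouts, and error handling are omitted):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Downloader registers a pipe, uploader fills it; bytes never touch disk.
@WebServlet("/relay/download")
public class DownloadRelay extends HttpServlet {
    // One pipe per in-flight transfer, keyed by an agreed transfer ID
    static final ConcurrentMap<String, PipedOutputStream> PIPES = new ConcurrentHashMap<>();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        PipedInputStream in = new PipedInputStream(64 * 1024); // in-memory buffer
        PIPES.put(req.getParameter("transfer"), new PipedOutputStream(in));
        resp.setContentType("application/octet-stream");
        copy(in, resp.getOutputStream()); // blocks until the uploader finishes
    }

    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        in.close();
    }
}

@WebServlet("/relay/upload")
class UploadRelay extends HttpServlet {
    @Override
    protected void doPut(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Feed the uploader's bytes into the pipe the downloader is reading from;
        // closing the pipe signals end-of-file to the downloader
        try (PipedOutputStream out = DownloadRelay.PIPES.remove(req.getParameter("transfer"))) {
            DownloadRelay.copy(req.getInputStream(), out);
        }
    }
}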
You can write a small client-side program in any language to publish the list of available files and to receive upload requests from the server side (and perform the upload).
I would recommend plain TCP sockets for the upload to the server side; that way you have direct control over the uploaded bytes (streams).
There are some interesting technical issues here (blocking streams, metadata such as filename, length, and creation date, data consistency, error handling, etc.) that should be taken into consideration. Nice task.
I don't recommend FTP, because you cannot control authentication and authorisation (who can see the files).
We are trying to improve application performance. We use Struts2, JSP, and web services, retrieving data from a service and displaying the response in a JSP page using Ajax.
This takes approximately 15 seconds, and my client is asking us to reduce this time, so we are implementing "parallelize downloads across hostnames".
How can we load the images and JS files from different sub-hostnames?
Please suggest.
How can we load the images and JS files from different sub-hostnames?
You use subdomains pointing to different servers.
For example, you can configure your DNS so that images.example.org points to Amazon S3 while js.example.org points to a dedicated server in Timbuktu, while all the other resources are downloaded from your "main" server(s), whatever that is.
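A hypothetical zone-file excerpt for such a setup (all hostnames and targets here are made up for the example):

; example.org zone: serve static assets from separate hosts
images  IN  CNAME  mybucket.s3.amazonaws.com.
js      IN  CNAME  static.example-cdn.net.

Pages then reference http://images.example.org/... and http://js.example.org/..., and the browser opens additional parallel connections for each distinct hostname.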