I have a GAE application (Java) and I have to populate my datastore from an external file. On localhost it works fine. The problem is that after deploying it I always get:
Error: Server Error
The server encountered an error and could not complete your request.
If the problem persists, please report your problem and mention this
error message and the query that caused it.
This is my file path: "war\WEB-INF\test.data"
Question: Is there any difference between local and remote access?
Any help?
We need to understand what you are executing to populate the datastore from the local file. Here are some points:
1. Assuming that you have uploaded the file into the WEB-INF folder with the name test.data, your file path should be `WEB-INF/test.data` (forward slashes, relative to the web application root); see the sketch after these points.
2. Are you running the code to load data via some URL, e.g. http://yourappid.appspot.com/loaddata or something like that? If yes, chances are that your code is taking longer than the 60-second hard limit that App Engine places on completing HTTP requests, so your request never completes.
3. If point 2 above is the case, I suggest you move your code to a cron job. Cron jobs have a 10-minute limit, which should be sufficient to load your data. I don't know how much data you plan to load, but 10 minutes is enough for a fairly large amount.
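A minimal sketch of point 1, reading the bundled file from a servlet (the servlet name, URL mapping, and entity handling are assumptions, not your code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class LoadDataServlet extends HttpServlet {
    @Override
    public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Resolve the file relative to the web application root; this works both
        // on the dev server and after deployment, unlike a Windows-style path
        // such as war\WEB-INF\test.data.
        InputStream in = getServletContext().getResourceAsStream("/WEB-INF/test.data");
        if (in == null) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND, "test.data was not bundled with the app");
            return;
        }
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // ... parse the line and persist a datastore entity ...
            }
        }
    }
}
```

And if point 3 applies, a sketch of war/WEB-INF/cron.xml that would hit such a servlet on a schedule (the URL and schedule are assumptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<cronentries>
  <cron>
    <url>/loaddata</url>
    <description>Populate the datastore from test.data</description>
    <schedule>every 24 hours</schedule>
  </cron>
</cronentries>
```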
Hope this helps.
I have a Java web application that generates a report, and I have the ability to export that report to an Excel file. The problem is that whenever I generate it as an Excel file, a "Connection Timed Out" page is displayed in the Firefox web browser.
Basically I have no idea why this is happening. I see no problems in my code; could it be a server issue or the amount of data I'm generating? Also, no error logs are being written.
Any advice or suggestions would be of great help. Thanks.
It sounds like the request is taking too long and is being timed out: generating the report exceeds a timeout on the client, the app server, or the web server (if you have a separate web server). You have a few options:
Find out where the timeout settings are in the Application Server and increase them
Speed up your report writing code so it doesn't take as long
Make the report writer an asynchronous job (e.g. by kicking off the report generation in a new thread), and have the client poll the server until it's finished, then request the file.
Update based on OP comment:
Regarding the last suggestion:
If the report's generated by another thread, the current request will return before the report is generated, so the browser won't have to wait at all. However, this is quite a large amount of work because you have to have a way for the client-side code to find out when the report is finished. Also, you are not supposed to launch your own threads from a Servlet.
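To make that concrete, here is a minimal sketch of the asynchronous pattern (all names are hypothetical). A small thread pool stands in for "a new thread"; in a real container it should be created and shut down by a ServletContextListener rather than spawned ad hoc from the servlet:

```java
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ReportService {
    private static final ExecutorService POOL = Executors.newFixedThreadPool(2);
    private static final ConcurrentMap<String, Future<byte[]>> REPORTS = new ConcurrentHashMap<>();

    // Kick off generation and return a token immediately, so the HTTP request
    // that triggered it responds long before the report is finished.
    public static String startReport() {
        String token = UUID.randomUUID().toString();
        REPORTS.put(token, POOL.submit(ReportService::generateExcelReport));
        return token;
    }

    // The client polls with its token; null means "not ready yet".
    public static byte[] fetchIfDone(String token) throws Exception {
        Future<byte[]> future = REPORTS.get(token);
        return (future != null && future.isDone()) ? future.get() : null;
    }

    private static byte[] generateExcelReport() {
        // ... the long-running report generation goes here ...
        return new byte[0];
    }
}
```

A status servlet could then call fetchIfDone and return, say, 204 while the report is pending and the file bytes once it is done.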
Maybe you can make the original request via AJAX, or in an iFrame? This way the restrictive timeout threshold may not be in effect.
We are developing an application which uploads CSV files.
In order to be sure about our code, the upload has been tested with two different frameworks: ZK (which manages the upload itself) and classic JSP/Spring REST.
On our local server (Windows, Tomcat 5.5) all is OK.
On the client's system (Solaris 10, Tomcat 5.5) we have a problem: the first time, the file is uploaded correctly; the second time, even if we change something in the data (or delete the file entirely), we get the same file as the first upload.
It seems a cache or something else is disturbing the upload.
Any idea?
Thank you.
[Edit] Additional information
For information, we are on Citrix MetaFrame Program Neighborhood (an old version, v9.0).
For those on site at the customer (with or without Citrix), CSV files are uploaded correctly each time.
For us, connecting from outside, it's not working.
File A is uploaded, then we modify it (A') and upload it again... and the result is: file A is deleted (as expected, programmatically), then a new file appears which is identical to A (not A' as expected).
If we stop Tomcat, or even make other HTTP requests first, the upload works correctly.
We tested the upload with two different frameworks: ZK (which manages the upload itself) and Spring MVC (REST). Both work on our servers with the same Tomcat (5.5).
Another strange thing: we have access to another server (via VPN, not Citrix) where we deployed the application on Tomcat 7 (already installed by the client). There, all is OK.
Is it possible that it's a hardware problem, e.g. with a router?
First of all, it is very difficult to understand your question. From what I understood, you are not able to load a new file the second time because the details of the first file are still present in memory/variables. Post your code so that this is easier to diagnose.
Try these steps:
1. Start the application, load a file (say A.csv) for the first time, then stop the application.
2. Start the application again, load another file (B.csv), and see if it is loaded correctly.
3. If steps 1 and 2 work correctly, you can be sure that no one has hard-coded anything in the code.
4. Now go through your code and see if you have any static variables being set with the contents of the file; a sketch of the pitfall follows below.
5. If removing static variables doesn't fix it, try printing all the variables to narrow down the issue.
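For illustration, this is the kind of static state that produces exactly this symptom (a hypothetical example, not your code): the second upload silently reuses the first upload's bytes, and only a Tomcat restart clears it, which matches what you describe:

```java
public class UploadHolder {
    // BUG: a static field is shared across all requests and survives between
    // uploads for the lifetime of the classloader, i.e. until Tomcat restarts.
    private static byte[] lastUpload;

    public static void store(byte[] data) {
        if (lastUpload == null) {   // only the very first upload ever "wins"
            lastUpload = data;
        }
    }

    public static byte[] get() {
        return lastUpload;
    }
}
```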
Good luck!
For the last couple of days I have been trying to copy data entities from one App Engine application to another. I read that this can't be done for Java applications. I did some more research on doing it via a Cloud Storage path, but found out that I can't do that between two applications.
Can anybody help with that? What steps should I follow to copy data entities from one application to another, taking into consideration that I want to do it from the UI, not code?
Now when I try to copy data from app1 to app2 from Datastore Admin, it gives:
"There was a problem kicking off the jobs.
The error was:
Fetch to https://blablabla.appspot.com/_ah/remote_api failed with status 404
Back to Datastore Admin"
Please help.
Thanks in advance.
Mohammed.
To restore backup data from a source application to a target application:
Create an access control list (ACL) on the source application's storage bucket with the following entry:
User: [PROJECT_ID]@appspot.gserviceaccount.com
Permission: Reader
where [PROJECT_ID] is the project ID of the target application.
https://cloud.google.com/appengine/docs/python/console/datastore-backing-up-restoring#restoring_data_to_another_app
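If clicking through the Cloud Storage browser is inconvenient, granting that same reader permission can also be scripted; a rough sketch with the google-cloud-storage Java client (the bucket name and project ID are placeholders):

```java
import com.google.cloud.storage.Acl;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class GrantBackupAccess {
    public static void main(String[] args) {
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Give the *target* application's service account read access to the
        // *source* application's backup bucket (both names are placeholders).
        Acl granted = storage.createAcl(
                "source-app-backups",
                Acl.of(new Acl.User("target-project-id@appspot.gserviceaccount.com"),
                       Acl.Role.READER));

        System.out.println("Created ACL entry: " + granted);
    }
}
```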
I'm working on a website and I need to implement functionality that allows a user to upload videos. I tried doing it, but at run time NetBeans reported a "max length exceeded" error. I tried resolving it by setting max_allowed_packet=100M in MySQL's my.ini file, but it did not help, as I got an "access denied" message.
Can you please guide me on how to get this fixed?
Don't do that. Use the filesystem to store multi-megabyte videos, and use the database to store filenames and paths. A huge video file in a database is also served less efficiently than a video stored directly in a file: web servers can use system calls like sendfile and splice to serve large files efficiently, but going through the database adds several copies and delays.
That being said, you need to set max_allowed_packet on both the client and server sides.
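A minimal sketch of that store-on-disk approach (the directory, table, and column names are assumptions):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class VideoStore {
    // Hypothetical upload directory; it must exist and be writable by the app server.
    private static final Path VIDEO_DIR = Paths.get("/var/data/videos");

    public static void save(String fileName, InputStream upload, Connection db)
            throws IOException, SQLException {
        // Write the multi-megabyte payload to disk...
        Path target = VIDEO_DIR.resolve(fileName);
        Files.copy(upload, target, StandardCopyOption.REPLACE_EXISTING);

        // ...and store only small metadata in MySQL, so max_allowed_packet
        // never comes into play for the video bytes themselves.
        try (PreparedStatement ps = db.prepareStatement(
                "INSERT INTO videos (name, path) VALUES (?, ?)")) {
            ps.setString(1, fileName);
            ps.setString(2, target.toString());
            ps.executeUpdate();
        }
    }
}
```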
We have implemented a web application as a scheduler which sends email campaigns to the configured mailing lists. It processes contacts one by one. How can I record the crash point so that I can restart my campaign process from where it stopped?
Example: I have configured 100 email IDs in the mailing list.
After processing 50 email IDs, the server shuts down or a crash occurs.
When I restart the server, it starts again from the 1st email ID instead of the 51st.
We have tried some solutions based on our application logic but that created performance issues. Is there any common solution that can be handled at the server level?
Can you please suggest a solution?
For example, you can save the index of the last processed email to a file, update it after each processed mail, and read it on start-up; a sketch follows. But what you really need to ask yourself is why the server is crashing.
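A minimal sketch of that file-based checkpoint, assuming a hypothetical file name and send method:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class CampaignCheckpoint {
    // Hypothetical checkpoint location, next to the application.
    private static final Path CHECKPOINT = Paths.get("campaign.checkpoint");

    // Index of the last successfully processed contact, or -1 on a fresh run.
    static int readLastProcessed() throws IOException {
        return Files.exists(CHECKPOINT)
                ? Integer.parseInt(Files.readString(CHECKPOINT).trim())
                : -1;
    }

    // Flush the index after *every* send so a crash loses at most one email.
    static void markProcessed(int index) throws IOException {
        Files.writeString(CHECKPOINT, Integer.toString(index));
    }

    static void runCampaign(List<String> emailIds) throws IOException {
        for (int i = readLastProcessed() + 1; i < emailIds.size(); i++) {
            // sendEmail(emailIds.get(i));   // hypothetical send call
            markProcessed(i);
        }
        Files.deleteIfExists(CHECKPOINT);    // campaign finished, reset
    }
}
```

With your example, a crash after 50 emails leaves 49 in the file, so a restart resumes at index 50, i.e. the 51st address.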
Try saving the email IDs in a text file and reading them one by one; as far as I know that is the best approach. Otherwise, store them in an XML file and read that; an XML parser is not heavyweight, so it should not hang your server.