Inserting millions of records from local to GoDaddy hosting - Java

I'm trying to insert about 8 million records from an Access database into a MySQL database hosted at GoDaddy.
I built a desktop Java app to manage the inserts, chunking 5,000 records at a time. I use Laravel 5.1 to handle the inserts on the server. So, basically, the Java app sends a request to a Laravel PHP route, which then takes care of inserting directly into my MySQL database.
Edit: people will keep inserting data directly into Access, so I have to watch the MDB file for changes. This is why I can't just export from Access and import into MySQL.
The first batch of records inserts successfully, but then when I send another request, I get this error:
2015-10-28 10:43:57.844 java[3475:280209] Communications error: <OS_xpc_error: <error: 0x7fff7298bb90> { count = 1, contents =
"XPCErrorDescription" => <string: 0x7fff7298bf40> { length = 22, contents = "Connection interrupted" }
}>
Got xpc error message: Connection interrupted
org.apache.http.NoHttpResponseException: api.mydomain.com:80 failed to respond
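For reference, here is a minimal sketch of the kind of chunked upload described above, using Apache HttpClient (the library the stack trace comes from) with a retry handler so a dropped keep-alive connection does not abort the whole run. The endpoint URL, payload format, and the loadChunksFromAccess helper are made-up placeholders, not the poster's actual code.

import java.io.IOException;
import java.util.Collections;
import java.util.List;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.DefaultHttpRequestRetryHandler;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class ChunkUploader {

    // Hypothetical Laravel route; replace with the real import endpoint.
    private static final String ENDPOINT = "http://api.mydomain.com/import";

    public static void main(String[] args) throws IOException {
        // Retry up to 3 times, even if the request was already sent, which is
        // what a NoHttpResponseException on a reused connection usually needs.
        try (CloseableHttpClient client = HttpClients.custom()
                .setRetryHandler(new DefaultHttpRequestRetryHandler(3, true))
                .build()) {

            // Each element is one JSON array of 5,000 rows read from the MDB file.
            List<String> chunks = loadChunksFromAccess();

            for (String chunkJson : chunks) {
                HttpPost post = new HttpPost(ENDPOINT);
                post.setEntity(new StringEntity(chunkJson, ContentType.APPLICATION_JSON));
                try (CloseableHttpResponse response = client.execute(post)) {
                    // Consume the body so the keep-alive connection can be reused.
                    EntityUtils.consume(response.getEntity());
                }
            }
        }
    }

    private static List<String> loadChunksFromAccess() {
        // Placeholder for reading the Access data and serializing it as JSON.
        return Collections.emptyList();
    }
}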

Make sure you have the MySQL ODBC connector installed:
https://dev.mysql.com/downloads/connector/odbc/
Create a DSN for your MySQL server (you can do this via the ODBC manager or just with a text editor).
Your DSN file will look like this:
[ODBC]
DRIVER=MySQL ODBC 5.3 Unicode Driver   ; use whichever MySQL ODBC driver you have installed
UID=your MySQL username
PORT=3306
PWD=your MySQL password
DATABASE=dbname
SERVER=server IP or hostname
Save the DSN file somewhere; let's call it GoDaddy_MySQL.dsn.
Open up your Access database.
Go to External Data
ODBC Database
Select "Link to the data source by creating a linked table"
Select the GoDaddy_MySQL.dsn file
If all your connection details are correct, Access will show you the tables and views it can link. Select the MySQL tables you would like to upload data to from your Access database.
Now you have the actual MySQL table linked inside your MS Access database.
All you need to do is upload data from your local table to the linked table:
You can chunk the upload using the TOP keyword. If you add a WHERE condition that excludes records already present in the linked table, you can keep uploading only the new records to your MySQL server.
If you are going to keep using your Access database, you can also switch from the local tables to the linked tables so every new entry is automatically written to your GoDaddy server.
For example (assuming an id key column):
INSERT INTO linked_table SELECT TOP 5000 local_table.* FROM local_table WHERE local_table.id NOT IN (SELECT id FROM linked_table);

Try a staggered MySQL importer that runs on the GoDaddy server, like BigDump: http://www.ozerov.de/bigdump/
Export from Access, then import into MySQL. The importer runs directly on the GoDaddy server, so the data is already on the server and just gets read. I've used this numerous times on GoDaddy for large imports.
Cheers!

Related

SQLite functions in Android apps

I'm very new to programming. I want to make an app where:
1. User1 will make an order (logged in as user_profile_1)
2. User2 will work with that order (logged in as user_profile_2)
One user shouldn't have access to the other's data.
Will SQLite allow me to do that? It seems the database must not run on the user's device, and the database will always be getting new records, so we need to update it all the time.
For this case, you have to create a server to store the data. I think you should do these things:
Use PHP, Node.js, or another server-side language to create APIs.
Use a MySQL database or some other database to store the data on your server.
Then different users can log in on their devices and you call your APIs to show what you want.
If you use a SQLite database, the DB will be local to the device. The records in the database are not shared with other devices.
If you want to access the database from different devices, the database will need to live on a remote server.
You can do different things:
You can use a MySQL database on a remote server with remote access allowed.
You can create a REST API on a LAMP server, for example with PHP and MySQL. You create a GET/POST endpoint, pass the data to it, and the PHP saves the information in the database. You can get information from the database the same way, but you would have to query it periodically.
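As a rough illustration of the client side of that approach, here is a minimal Java sketch that posts an order to a hypothetical REST endpoint and prints the response. The URL and JSON fields are assumptions for illustration, not a real API.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class OrderClient {

    // Hypothetical endpoint; replace with the URL your own API exposes.
    private static final String API_URL = "https://example.com/api/orders";

    public static void main(String[] args) throws IOException {
        String json = "{\"userId\": 1, \"item\": \"example order\"}";

        HttpURLConnection conn = (HttpURLConnection) new URL(API_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // Send the order as a JSON body.
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }

        // Read the server's reply (e.g. the stored order's id).
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}

On Android the same request would have to run off the main thread, for example in a background executor.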

How do I recreate the DB and import JIRA data from a previous automated export

I have my own local instance of the Atlassian JIRA issue server on a hosted server. Unfortunately the hard drive failed on the hosted server. The administrator replaced the drive and recovered the program and data directories, but unfortunately the administrator had not actually made a backup of the database :(
Luckily it seems that JIRA automatically does a data export; I have daily zip files such as 2015-Apr-20--0555.zip in
atlassian/application-data/jira/export
But what is the easiest way to import this data?
The server is starting up but I can't access any admin pages. How do I go about recreating the database (I don't remember any details of how I did this the first time)? Should I reinstall JIRA from scratch, or can I fix the existing installation to accept the export file?
If the data is exported as SQL, then you can easily use MySQL Admin to import the data into the new JIRA database. If the data is a comma-separated file, you would normally use the MySQL command line for that, but based on what you say you can't, because you can't reach the server's command window. In that case you can create a simple PHP file to insert the data from the zip file into the DB. This link may help you:
http://coyotelab.org/php/upload-csv-and-insert-into-database-using-phpmysql.html

Is it possible to save a database file in the project folder of a Java application?

This post is a continuation of my previous question here. I had a look into how MySQL works with Java, but I noticed that the computer must be running a database server for the application to connect to. So what will happen when my software is ready and users want to run it on different computers? Can't I save the database file in the directory of the software, so that any copy of the program is connected to its own independent database to save data to and read data from?
Just to make it clear: in one part of my software, I need to keep a record of previous interactions, like a history table.
Would using JSON be a better option in this case?
In the real world, database servers are generally installed on one machine and the software on a different machine.
We let the software know the database configuration (database URL, database name, username, password, etc.) through a property file or through JNDI configuration. Then the Java program can connect to the database with the help of a JDBC driver.
Note: one database server can host many databases.
If you want to distribute your software without a dependency on a client-side database server, then I would recommend using an embedded/in-memory DB that you can ship with your software (alternatively you can write logic so that if the client database can't be found, the embedded DB is used instead).
H2 is my favorite one; it also supports a persistent file mode and many DB dialects, including MySQL.
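A minimal sketch of that embedded approach, assuming the H2 driver jar is on the classpath; the file path and history table below are made up for illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class EmbeddedHistoryDb {

    public static void main(String[] args) throws SQLException {
        // "./data/history" stores the database as files next to the application,
        // so every copy of the program gets its own independent database.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:./data/history", "sa", "");
             Statement st = conn.createStatement()) {

            // A simple history table for previous interactions.
            st.execute("CREATE TABLE IF NOT EXISTS history ("
                    + "id IDENTITY PRIMARY KEY, "
                    + "occurred_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, "
                    + "action VARCHAR(255))");

            st.execute("INSERT INTO history (action) VALUES ('application started')");

            try (ResultSet rs = st.executeQuery("SELECT occurred_at, action FROM history")) {
                while (rs.next()) {
                    System.out.println(rs.getTimestamp(1) + "  " + rs.getString(2));
                }
            }
        }
    }
}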

How to export a Java project with a database (connected to SQL Server) for use on different computers?

I'm new to working with databases and connecting Java to SQL Server. I just created a database and a Java application for it. I used the sqljdbc4.jar file in my library and all the stuff required to connect the app to the database, but I want to know: what do I need to export so that someone on another computer can use that application and have that database on their computer, without having SQL Server or anything else installed? What do I need to do?
It depends on what you want to do. If you want them to connect to a central database via your app, then you will have to let them access that database through your application by granting the necessary privileges. If you want to bundle the whole database with your Java code, then you need to use an embedded database like H2: http://www.h2database.com/html/main.html. There are alternatives such as Derby and HSQLDB, but H2 is better; see the comparison on the H2 homepage.
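For the first option (everyone connecting to one shared database through the app), a small hedged sketch of reading the connection details from a properties file and connecting over JDBC; the file name, property keys, and server address are assumptions for illustration:

import java.io.FileInputStream;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class RemoteDbConnection {

    public static void main(String[] args) throws IOException, SQLException {
        // db.properties is a made-up file shipped next to the application, e.g.:
        //   db.url=jdbc:sqlserver://your-server:1433;databaseName=yourdb
        //   db.user=appuser
        //   db.password=secret
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("db.properties")) {
            props.load(in);
        }

        // sqljdbc4.jar must be on the classpath for the SQL Server URL to work.
        try (Connection conn = DriverManager.getConnection(
                props.getProperty("db.url"),
                props.getProperty("db.user"),
                props.getProperty("db.password"))) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}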

MySQL database backup

We have a MySQL table which consists of 300 million rows. Data gets inserted into the database frequently and there must be no downtime. What is the ideal way to back up this data? Is MySQL Enterprise Backup a good option?
Use Percona Server with the InnoDB engine. Percona XtraBackup includes the innobackupex utility, which can dump your database on the fly.
Or you can place your data folder on an LVM partition and create a snapshot. But it's slooooow...
Another way is replication. You can set up another MySQL server as a slave (read-only) and create backups from that second server. But it needs more money =)
