Copy table from Teradata to MySQL - Java

I have searched for hours with no real solution.
I want to set up an ongoing task (every night). I have a table in a Teradata database on server 1. Every night I need to copy an entire table from this Teradata instance to my development server (server 2), which runs MySQL 5.6.
How do I copy an entire table from server 1 to server 2?
Things I have tried:
1) Select all data from table x into a ResultSet from Teradata server 1, then insert into MySQL via a PreparedStatement (rough sketch below). But this is crazy slow. Also, I am not sure how to drop the table and recreate it each night with the schema from the Teradata server.
Any help please?
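For reference, this is roughly what my current row-by-row approach looks like (connection URLs, credentials, and table/column names are simplified placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class TableCopy {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust for your own servers.
        try (Connection td = DriverManager.getConnection(
                 "jdbc:teradata://server1/DATABASE=mydb", "user", "pass");
             Connection my = DriverManager.getConnection(
                 "jdbc:mysql://server2:3306/devdb", "user", "pass")) {

            try (Statement st = td.createStatement();
                 ResultSet rs = st.executeQuery("SELECT col1, col2 FROM mydb.x");
                 PreparedStatement ps = my.prepareStatement(
                     "INSERT INTO x (col1, col2) VALUES (?, ?)")) {

                // One network round trip per row -- this is where it crawls.
                while (rs.next()) {
                    ps.setObject(1, rs.getObject(1));
                    ps.setObject(2, rs.getObject(2));
                    ps.executeUpdate();
                }
            }
        }
    }
}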

There are a few ways you can do this.
Note: these may be older methods; they are just meant to get you thinking about how you can do this in your current environment. Also, I am not familiar with your data sensitivity, permissions, etc.
One way would be Teradata to MySQL via a CSV file; see the examples and links below (these could be older posts, but the basic ideas are what you need).
Export from Teradata:
CREATE EXTERNAL TABLE
database_name.table_name (to be created)
SAMEAS database_name.table_name (already existing, whose data is to be exported)
USING (DATAOBJECT ('C:\Data\file_name.csv')
DELIMITER '|' REMOTESOURCE 'ODBC');
Export From Teradata Table to CSV
Edit: If CREATE EXTERNAL TABLE doesn't fly, then you may have to use Java to extract the data first and then organize it... mimic the current method (however it works) of getting the data. Google-fu with this handy link: https://www.google.com/search?q=external+csv+file+teradata&oq=external+csv+file+teradata&
dnoeth (below) recommends this: TPT Export in DELIMITED format (which looks like a hassle... but could be the only way). So here is a link that discusses it: http://developer.teradata.com/tools/articles/external-data-formats-supported-by-the-tpt-dataconnector-operator
Import into MySQL (don't drop the table, just DELETE FROM it):
mysqlimport --ignore-lines=1 \
--fields-terminated-by='|' \
--local -u root \
-p Database \
TableName.csv
(Note: the field terminator must match whatever delimiter the export used; the Teradata example above used '|'.)
http://chriseiffel.com/everything-linux/how-to-import-a-large-csv-file-to-mysql/
You would need to schedule this in both environments, and that could be a huge hassle.
Now, I see you use Java, and in Java you could create a simple scheduled task (via whatever you have available for scheduling tasks). You will probably need some trial-and-error runs, and your data itself could be an issue depending on what it is: how it is delimited, whether it has headers, etc.
Then you would call variants of the examples above through Java.
Here is a Java example:
http://www.java-tips.org/other-api-tips/jdbc/import-data-from-txt-or-csv-files-into-mysql-database-t-3.html
and another:
How to uploading multiple csv file into mysql database using jsp and servlet?
another:
http://www.csvreader.com/java_csv_samples.php
So the basic idea is: export your CSV from Teradata, which should be simple.
Use Java to traverse the file server to get your file (you may need to FTP it from somewhere); you will need to consider this....
Consume your CSV file using Java and either a UDF or some package that can iterate over your CSV (stuff) and import it into MySQL (write one yourself or find one on the "internet of things" as they now call it). A rough sketch of that step follows.
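As a rough sketch of that consume-and-import step, here is one way to read a delimited file and batch-insert it into MySQL with plain JDBC (file name, delimiter, table, and column names are placeholders; rewriteBatchedStatements=true is a MySQL Connector/J option that makes batching pay off):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CsvToMysql {
    public static void main(String[] args) throws IOException, SQLException {
        String url = "jdbc:mysql://server2:3306/devdb?rewriteBatchedStatements=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             BufferedReader in = Files.newBufferedReader(Paths.get("C:/Data/file_name.csv"));
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO x (col1, col2) VALUES (?, ?)")) {

            conn.setAutoCommit(false);               // commit in chunks, not per row
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {
                // Naive split: fine for simple data; use a CSV library for quoted fields.
                String[] fields = line.split("\\|", -1);
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.addBatch();
                if (++count % 5000 == 0) {           // flush every 5000 rows
                    ps.executeBatch();
                    conn.commit();
                }
            }
            ps.executeBatch();                       // flush the remainder
            conn.commit();
        }
    }
}

Row-by-row inserts from a ResultSet are slow for exactly this reason; batching (plus the rewrite option) is usually the difference between hours and minutes.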
Edit: Here is info on scheduling tasks with Java, and links to review.
http://docs.oracle.com/javase/6/docs/api/java/util/concurrent/ScheduledExecutorService.html
and check this SO convo...
How to schedule a periodic task in Java?
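As a minimal sketch, a nightly job could be wired up with ScheduledExecutorService like this (the 2:00 AM start time is arbitrary, and runCopy() is a placeholder for whichever export/import variant you settle on):

import java.time.Duration;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class NightlyCopy {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Delay until the next 2:00 AM, then repeat every 24 hours.
        LocalDateTime now = LocalDateTime.now();
        LocalDateTime next = now.toLocalDate().atTime(LocalTime.of(2, 0));
        if (!next.isAfter(now)) {
            next = next.plusDays(1);
        }
        long initialDelayMinutes = Duration.between(now, next).toMinutes();

        scheduler.scheduleAtFixedRate(NightlyCopy::runCopy,
                initialDelayMinutes, TimeUnit.DAYS.toMinutes(1), TimeUnit.MINUTES);
    }

    private static void runCopy() {
        // Placeholder: export from Teradata, then import into MySQL as above.
        System.out.println("Running nightly table copy at " + LocalDateTime.now());
    }
}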

Related

Export and import data from parse server

I am trying to export and import my database from Parse Server into/from XLS and CSV files. I am using the Bitnami Parse Server with dashboard 1.0.25. Wherever I search, it either states that Parse has been shut down or that this functionality does not exist on Parse Server. My question is: is there any way to do it without much hassle? Thank you. Also, should I keep on using Parse Server, as it is open source, or switch to another, since many websites mention that developers are moving to other databases such as MongoDB or Firebase, etc.?
Thank you for your replies.
Just call the data from the server in your model and export it locally. Why do you want to export files on the server?
In the shell, use mongodump, or psql yourdatabasename < db.psql in the case of Postgres.

Synchronous use of database insertion

I have created a Java application that inserts data into a MySQL database. Under some conditions, I need to post some of this data via email, using a Java application that I will write as well.
My problem is that I am not sure how I should implement this.
From what I understand, I could use a UDF inside MySQL to execute a Java application, but there are many opinions against using that. Besides, both the database and the mail client application will reside in a VM that I don't have admin access to, and I don't want to install anything that neither I nor the admin knows.
My other alternative, as far as I can think of, is to set up the mail client (or some other application) to run every minute just to check for newly inserted data. Is this a better approach? Isn't it going to use resources while doing almost nothing? At the moment the VM might not be heavily loaded, but I have no idea how many applications might end up running on the same machine.
Is there any other alternative I should consider using?
You also need to consider internet speed, database server load, and system resources. If you have enough memory and the load from inserting data into the database is not too high, then you can approach this with a cron setup. On Linux, call a script every 5 minutes. The script performs the following:
1. Fetch unread emails as files.
2. Run a shell script to read the needed data.
3. Write the data to MySQL.
4. Delete the email.
If you have a heavily loaded system, then you may need to do this only once or twice an hour, or adjust as needed.
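If you instead go with the polling approach from the question (a Java task that checks for newly inserted rows every minute), a minimal sketch might look like this; the events table and its mailed flag column are assumptions about your schema, and the actual mail sending is left as a placeholder:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class NewRowPoller {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleWithFixedDelay(NewRowPoller::checkForNewRows, 0, 1, TimeUnit.MINUTES);
    }

    private static void checkForNewRows() {
        // Assumes a 'mailed' flag column on the table -- adjust to your schema.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/mydb", "user", "pass");
             PreparedStatement select = conn.prepareStatement(
                 "SELECT id, payload FROM events WHERE mailed = 0");
             PreparedStatement mark = conn.prepareStatement(
                 "UPDATE events SET mailed = 1 WHERE id = ?")) {

            try (ResultSet rs = select.executeQuery()) {
                while (rs.next()) {
                    // Placeholder: send the email (e.g., via JavaMail) here.
                    System.out.println("Would mail row " + rs.getLong("id"));
                    mark.setLong(1, rs.getLong("id"));
                    mark.executeUpdate();
                }
            }
        } catch (Exception e) {
            e.printStackTrace();   // keep the scheduled task alive on failure
        }
    }
}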

How do I recreate db and import JIRA data from previous automated export

I have my own local instance of the Atlassian JIRA issue server on a hosted server. Unfortunately, the hard drive failed on the hosted server; the administrator replaced the drive and recovered the program and data directories, but unfortunately the administrator had not actually made a backup of the database :(
Luckily, it seems that JIRA automatically does an export of the data; I have daily zip files such as 2015-Apr-20--0555.zip in
atlassian/application-data/jira/export
But what is the easiest way to import this data?
The server is starting up, but I can't access any admin pages. How do I go about recreating the database (I don't remember any details of how I did this the first time)? Should I reinstall JIRA from scratch, or can I fix the existing installation to accept the export file?
If the data was exported as SQL, then you can easily use MySQLAdmin to import the data into the new JIRA database. If the data is a comma-separated file, then you can use a MySQL command for that; but based on what you say, you can't do that because you can't reach the server command window. In that case, you can create a simple PHP file to insert the data from the zip file into the DB. This link may help you:
http://coyotelab.org/php/upload-csv-and-insert-into-database-using-phpmysql.html

Mysql database backup

We have a MySQL table which consists of 300 million rows. Data gets inserted into the database frequently, and there must be no downtime. What is the ideal way to back up this data? Is MySQL Enterprise Backup a good option?
Use Percona with the InnoDB engine. Percona's XtraBackup includes the innobackupex utility, which can dump your database on the fly.
Or you can place your data folder on an LVM partition and create a snapshot. But it's slooooow...
Another way: replication. You can set up another MySQL server as a slave (read-only) and create backups from that second server. But it needs more money =)

MySQL: Backup Remote Database

I am working with MySQL and need to do a remote backup of a database. As of now, I am doing ssh/scp for this operation. Is there any other, simpler way to achieve this, like a single command, so that it can be used via the Java Runtime library?
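As a sketch of the single-command idea: mysqldump can connect to a remote host directly, and Java can invoke it via ProcessBuilder (host, credentials, and file names below are placeholders; this assumes mysqldump is installed locally and the server allows remote connections):

import java.io.IOException;
import java.nio.file.Paths;

public class RemoteDump {
    public static void main(String[] args) throws IOException, InterruptedException {
        // All values are placeholders -- adjust host, user, password, database.
        ProcessBuilder pb = new ProcessBuilder(
                "mysqldump",
                "-h", "remote.host.example",
                "-u", "backupuser",
                "-psecret",      // or use --defaults-extra-file to keep it off the command line
                "mydatabase");
        pb.redirectOutput(Paths.get("backup.sql").toFile());
        pb.redirectError(ProcessBuilder.Redirect.INHERIT);

        Process p = pb.start();
        int exit = p.waitFor();
        System.out.println("mysqldump exited with code " + exit);
    }
}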
