Access Pervasive/Btrieve DB (DDF + DAT files) from Java

I have a folder with *.DDF and *.DAT files that make up a Pervasive/Btrieve database. I am able to open and see the content of the database with DDF Periscope (ddf-periscope.com).
I can export data from each table individually using DDF Periscope, and I would like to do the same thing using Java: access the data in the DB and export it to a CSV file, POJOs, or any form in which I can manipulate the data.
Is this possible?

You can use either JDBC or the JCL interfaces to access the data. You do still need the Pervasive engine, but you can use Java. Here is a simple sample for the JDBC driver.
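For example, a minimal sketch of a JDBC export to CSV; the driver class name, port, database name, table, and credentials are placeholders you would adjust to your engine's setup, so treat this as an illustration rather than the exact driver API:

import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class PervasiveExport {
    public static void main(String[] args) throws Exception {
        // Driver class and URL as typically used by the Pervasive/Actian PSQL JDBC driver;
        // verify both against your installed driver's documentation.
        Class.forName("com.pervasive.jdbc.v2.Driver");
        String url = "jdbc:pervasive://localhost:1583/MYDB"; // MYDB = name the engine exposes for the DDF/DAT folder

        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM MyTable");   // hypothetical table name
             PrintWriter out = new PrintWriter(new FileWriter("MyTable.csv"))) {

            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) line.append(',');
                    line.append(rs.getString(i));   // no CSV quoting/escaping, for brevity
                }
                out.println(line);
            }
        }
    }
}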
I don't have a JCL sample but there should be one in the Pervasive / Actian Java Class Library SDK.

Related

Generate txt file under /resources and immediately read it without rebuilding the app

I have a rather simple question. I have a controller with two endpoints
/newFile
/downloadFile/{fileName}
The /newFile endpoint creates a simple .txt file under resources,
and my expectation would be to hit /downloadFile/{fileName} and download the newly created resource.
Resource res = new ClassPathResource(fileName);
As it turned out, the classpath resource looks under build/resources and the file is not there yet (because I haven't restarted/rebuilt the app; once I do, I can download the file). But the whole idea is to dynamically generate files and access them immediately. Is there a quick way to bypass this without having to rebuild?
I had the same problem when I was working on a face recognition API, which had the same flow of creating a file and uploading it for analysis.
Java builds the project and runs it on the JVM, so a file created at runtime won't end up on the classpath. What you need to do is use a database, cloud storage, or NFS instead.
In my view, using a database is the best option. CodeJava's "How to upload files to database with Servlet, JSP and MySQL" and Javatpoint's article on storing a file in an Oracle database are documents you can refer to when implementing this.
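As a rough sketch of the database route with plain JDBC (the table DDL, MySQL URL, and credentials below are made-up placeholders; the linked tutorials show fuller versions):

import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FileStore {
    // Assumes a table like: CREATE TABLE files (name VARCHAR(255) PRIMARY KEY, content LONGBLOB)
    private static final String URL = "jdbc:mysql://localhost:3306/mydb"; // placeholder

    // Called from the /newFile endpoint instead of writing under src/main/resources
    public static void save(Path file) throws Exception {
        try (Connection con = DriverManager.getConnection(URL, "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO files (name, content) VALUES (?, ?)")) {
            ps.setString(1, file.getFileName().toString());
            ps.setBytes(2, Files.readAllBytes(file));
            ps.executeUpdate();
        }
    }

    // Called from /downloadFile/{fileName}; the bytes can be wrapped in a Spring ByteArrayResource
    public static byte[] load(String fileName) throws Exception {
        try (Connection con = DriverManager.getConnection(URL, "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT content FROM files WHERE name = ?")) {
            ps.setString(1, fileName);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getBytes(1) : null;
            }
        }
    }
}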

How to load DynamoDB tables from a YAML template in Java

I'm testing a service that accesses DynamoDB tables, using Testcontainers.
The necessary tables are being created in the Java test class before the tests run (partial code below) and everything works fine.
CreateTableRequest request = new CreateTableRequest()
.withTableName(TABLE_NAME)
.withKeySchema(keySchema)
.withAttributeDefinitions(attributeDefinitions)
.withProvisionedThroughput(new ProvisionedThroughput()
.withReadCapacityUnits(5L)
.withWriteCapacityUnits(6L));
Table table = dynamoDB.createTable(request);
table.waitForActive();
However, the "real" tables are deployed to AWS via a cloudformation template (in YAML).
My question is: Is there any way to use that template on tests? I mean, import and create those tables from it and not with the code above?
Maybe via an AWS CLI command or some library that I could use to read the YAML file, and create the tables used for testing based on the template.
I've searched a lot about this and can't find anything in Java.
Thanks in advance.
Your question - "that I could use to read the YAML file, and create the tables "
To read YAML from Java, you would need to use a supported API - for example:
https://dzone.com/articles/read-yaml-in-java-with-jackson
Once you are able to read YAML, you can use the Java SDK for Java (preferably V2) to interact with the DynamoDB service.
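A rough sketch of that idea follows. It assumes the template contains plain AWS::DynamoDB::Table resources without intrinsic tags such as !Ref (those would need a custom YAML tag handler), jackson-dataformat-yaml on the classpath, and the AWS SDK for Java V2; the file name and local endpoint are placeholders, and global secondary indexes are not handled:

import java.io.File;
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeDefinition;
import software.amazon.awssdk.services.dynamodb.model.BillingMode;
import software.amazon.awssdk.services.dynamodb.model.CreateTableRequest;
import software.amazon.awssdk.services.dynamodb.model.KeySchemaElement;

public class TablesFromTemplate {
    public static void main(String[] args) throws Exception {
        JsonNode root = new ObjectMapper(new YAMLFactory()).readTree(new File("template.yml"));

        DynamoDbClient ddb = DynamoDbClient.builder()
                .endpointOverride(URI.create("http://localhost:8000")) // e.g. the Testcontainers endpoint
                .region(Region.US_EAST_1)
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("test", "test")))   // dummy credentials for a local engine
                .build();

        // Iterate over the values of the Resources map and create each DynamoDB table definition
        for (JsonNode resource : root.path("Resources")) {
            if (!"AWS::DynamoDB::Table".equals(resource.path("Type").asText())) continue;
            JsonNode props = resource.path("Properties");

            List<AttributeDefinition> attrs = new ArrayList<>();
            for (JsonNode a : props.path("AttributeDefinitions")) {
                attrs.add(AttributeDefinition.builder()
                        .attributeName(a.path("AttributeName").asText())
                        .attributeType(a.path("AttributeType").asText())
                        .build());
            }
            List<KeySchemaElement> keys = new ArrayList<>();
            for (JsonNode k : props.path("KeySchema")) {
                keys.add(KeySchemaElement.builder()
                        .attributeName(k.path("AttributeName").asText())
                        .keyType(k.path("KeyType").asText())
                        .build());
            }

            ddb.createTable(CreateTableRequest.builder()
                    .tableName(props.path("TableName").asText())
                    .attributeDefinitions(attrs)
                    .keySchema(keys)
                    .billingMode(BillingMode.PAY_PER_REQUEST) // ignores any ProvisionedThroughput in the template
                    .build());
        }
    }
}

The question's test code uses the V1 document API; this sketch follows the answer's suggestion of V2, but the same tree-walking approach would work against the V1 CreateTableRequest as well.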

HSQLDB data storage as file in Spring Boot Application, unable to visualize

I am trying my hand at a Spring Boot application. I planned to use HSQLDB for the database.
Purpose: Create User Table, Insert, Update, Delete data
I created a User entity and a User DAO, and saved user data through the entity.
Everything is working fine.
What I want is to see the data in the table, as we can for MySQL.
I tried RazorSQL and DBeaver, but I can't see the tables.
application.properties
spring.jpa.hibernate.ddl-auto: update
spring.jpa.hibernate.dialect=org.hibernate.dialect.HSQLDialect
spring.jpa.database: HSQL
spring.jpa.show-sql: true
spring.hsql.console.enabled: true
spring.datasource.url=jdbc:hsqldb:file:data/mydb
spring.datasource.username=sa
spring.datasource.password=
spring.datasource.driverClassName=org.hsqldb.jdbcDriver
I can see the User table data in the browser. Files are created in the data folder (mydb.log, mydb.properties, mydb.script, mydb.tmp).
I have googled a lot but nothing helps.
Questions:
Can we see the data stored in HSQLDB (while it is running), as we can for MySQL in phpMyAdmin?
In which file is the data stored? I have seen that the script file saves all the statements (insert, delete, etc.). Is there a separate file that stores the data?
What is the tmp file created for?
Let me know if you need more details. I want to be clear on this; it has taken a lot of time and I am still not satisfied.
I was able to visualize the HSQLDB data with the help of hsqldb.jar.
Assuming a database folder named "data" has been created and contains the files mydb.log, mydb.properties, mydb.script, mydb.tmp.
Steps to visualize it when using a file DB:
1. Download the HSQLDB jar.
2. Extract it in the folder that contains the "data" folder with the generated database files.
3. From that folder (the one containing the database folder), run the command "java -cp hsqldb-2.4.1/hsqldb/lib/hsqldb.jar org.hsqldb.util.DatabaseManagerSwing", where "hsqldb-2.4.1" is the extracted HSQLDB folder. It will open up a UI.
4. In this UI, make a new connection, select the type "HSQL Database Engine Standalone", set the URL to "jdbc:hsqldb:file:data/mydb" (here data is the folder and mydb is the DB name), enter the user and password defined in application.properties, then click OK. It should connect. (Make sure the path to the file DB is relative to the folder from which you opened the UI.)
Let me know in case anyone is getting errors.
You can run HSQLDB as a Server and connect to it simultaneously from your Spring app and from a database utility such as DBeaver. The connection URL will look like jdbc:hsqldb:hsql://localhost/mydb. This is very similar to the way MySQL is used.
Detailed coverage is here: http://hsqldb.org/doc/guide/listeners-chapt.html but see the introduction to the Guide first. You can also consult various step-by-step HSQLDB tutorials on the web.
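For example, a minimal sketch of starting such a server in-process (assuming hsqldb.jar is on the classpath; the path and database name mirror the question's data/mydb setup):

import org.hsqldb.server.Server;

public class HsqldbServerStarter {
    public static void main(String[] args) {
        Server server = new Server();
        server.setDatabasePath(0, "file:data/mydb"); // the existing file database
        server.setDatabaseName(0, "mydb");           // name exposed in the connection URL
        server.setPort(9001);                        // HSQLDB's default server port
        server.start();
        // Point spring.datasource.url at jdbc:hsqldb:hsql://localhost:9001/mydb instead of the
        // file: URL, and connect DBeaver with that same URL at the same time.
    }
}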

JDBC with HBase?

I want to store data on HDFS, so I need to access HBase. How can I connect to HBase using the Java APIs?
Please suggest.
Thanks.
HBase has a Java API. Have a look at http://hbase.apache.org/apidocs/index.html
Two important classes are:
1) HBaseAdmin
2) HTable
HBaseAdmin is the admin API used to create/delete/alter tables.
HTable is the client API used to put/get/scan records.
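A minimal sketch of the classic (pre-1.0) client API those classes belong to; the table, column family, and qualifier names are made up for illustration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseClientSketch {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath (ZooKeeper quorum, etc.)
        Configuration conf = HBaseConfiguration.create();

        HTable table = new HTable(conf, "mytable");   // hypothetical table name
        try {
            Put put = new Put(Bytes.toBytes("row1"));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("qual"), Bytes.toBytes("value"));
            table.put(put);

            Get get = new Get(Bytes.toBytes("row1"));
            Result result = table.get(get);
            System.out.println(Bytes.toString(
                    result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("qual"))));
        } finally {
            table.close();
        }
    }
}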
I wrote a simple framework for operating on HBase:
https://github.com/zhang-xzhi/simplehbase/
Simplehbase is a lightweight ORM framework between a Java app and HBase.
Its main features are the following:
data type mapping: mapping Java types to HBase's bytes and back.
hbase operation wrapping: wrapping HBase's put/get/scan operations into a simple Java interface.
hbase query language: using HBase filters, simplehbase can use an SQL-like style to operate on HBase.
dynamic query: like MyBatis, simplehbase can use an XML config file to define dynamic queries against HBase.
insert/update support: provides insert and update on top of checkAndPut.
multiple version support: provides an interface to operate on HBase's multiple versions.
HBase batch operation support.
HBase native interface support.
HTablePool management.
HTable count and sum.
Use HBase as the source via the TableMapper class and store the output in HDFS, roughly as sketched below.
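A rough sketch of that approach; the table name, column family/qualifier, and output path are made up, and it assumes the HBase MapReduce artifacts are on the classpath:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HBaseToHdfs {

    // Emits rowkey -> one column value; a map-only job, so output goes straight to HDFS
    public static class ExportMapper extends TableMapper<Text, Text> {
        @Override
        protected void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            String v = Bytes.toString(value.getValue(Bytes.toBytes("cf"), Bytes.toBytes("qual")));
            context.write(new Text(Bytes.toString(row.get())), new Text(v));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "hbase-to-hdfs");
        job.setJarByClass(HBaseToHdfs.class);

        Scan scan = new Scan();
        scan.setCaching(500);
        scan.setCacheBlocks(false); // recommended for MapReduce scans

        TableMapReduceUtil.initTableMapperJob(
                "mytable", scan, ExportMapper.class, Text.class, Text.class, job);
        job.setNumReduceTasks(0);   // map-only: mapper output is written directly to the output path
        FileOutputFormat.setOutputPath(job, new Path("/user/hadoop/hbase-export"));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}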

Creating a database changelog xml file from an existing database (including stored procs) using Liquibase

Is it possible to create an initial database changelog xml file from the existing state of the database?
I believe I've generated the schema using generateChangeLog, but it doesn't seem to return the stored procedures (or the data).
I'm using SQL Server 2008.
You can return the data using a diffTypes flag that includes "DATA". See http://www.liquibase.org/documentation/diff.html.
Liquibase cannot currently output stored procedures, however. For that you will need to use a different tool and include them in the generated changelog using the appropriate tag.
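For example, a sketch of a generateChangeLog invocation including data; the driver jar, URL, and credentials are placeholders to adjust for your SQL Server instance:
java -jar liquibase.jar --driver=com.microsoft.sqlserver.jdbc.SQLServerDriver --classpath=sqljdbc4.jar --changeLogFile=db.changelog.xml --url="jdbc:sqlserver://localhost:1433;databaseName=<databasename>" --username=<username> --password=<password> generateChangeLog --diffTypes="tables,views,columns,indexes,foreignkeys,primarykeys,uniqueconstraints,data"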
Download liquibase.jar and the database driver and save them to one directory (e.g. /home/mySystem/liquibase), change to that directory (/home/mySystem/liquibase) on the command line, and run the command below:
java -jar liquibase.jar --driver=org.postgresql.Driver --classpath=postgresql-42.1.3.jar --changeLogFile=db.changelog.xml --url="jdbc:postgresql://localhost:5432/<databasename>" --username=<username> --password=<password> generateChangeLog
Reference link:
http://www.liquibase.org/documentation/generating_changelogs.html
I just yesterday discovered SQL Power Architect, which seems to be able to generate Liquibase configurations:
http://www.sqlpower.ca/page/architect
For more info on this combination see this blog post:
http://blog.mgm-tp.com/2010/11/data-modeling-part2/
