We have to upgrade our databases from Oracle v11 to v19.
The database side itself doesn't matter for this discussion, because the upgrade will be handled by our DBAs.
Besides upgrading the JDBC drivers and maybe changing how they are instantiated, what are the possible problems and refactorings we might have to do?
The applications are written in Java.
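To illustrate what "changing how they are instantiated" might amount to: older code often registers the Oracle driver explicitly, while a JDBC 4+ driver registers itself. A minimal sketch, assuming a 19c-capable driver jar (ojdbc8 or ojdbc10) on the classpath; the host, service name and credentials are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class OracleConnectionSketch {
        public static void main(String[] args) throws Exception {
            // Older code often registered the driver explicitly:
            //   Class.forName("oracle.jdbc.driver.OracleDriver");
            // With a JDBC 4+ driver jar this is unnecessary: the driver registers
            // itself through the service loader, so usually only the URL, the jar
            // version and the connection pool configuration need revisiting.
            String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1"; // placeholder host/service
            try (Connection conn = DriverManager.getConnection(url, "app_user", "secret")) {
                System.out.println("Connected to: "
                        + conn.getMetaData().getDatabaseProductVersion());
            }
        }
    }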
Related
I have a rather trivial question. In our project we are using Oracle 10g as the database and Java 1.8 as the language. Can anyone please suggest which JDBC driver would be suitable in this case?
In the Oracle documentation at the link below:
http://www.oracle.com/technetwork/apps-tech/jdbc-10201-088211.html
I found that ojdbc14.jar should be used. However, it says the classes were compiled with JDK 1.4 and 1.5. Should I still use that driver with JDK 1.8?
You can try using ojdbc14.jar, but it won't support the methods introduced in JDBC 4 (Java 6), JDBC 4.1 (Java 7) and JDBC 4.2 (Java 8). Because Java emphasizes backwards compatibility, you will most likely be able to open the database connection, however:
The code will throw a LinkageError at runtime if you use methods introduced in JDBC 4+. The code will compile against the JDBC 4.2 interfaces (Java 8), but these methods will not be available at runtime. For example:
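This is only an illustration; the connection details and the orders table are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.time.LocalDate;

    public class Jdbc42Check {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {

                // getSchema() was added in JDBC 4.1 (Java 7). It compiles against the
                // java.sql interfaces, but an old driver such as ojdbc14.jar has no
                // implementation behind it, so the call can fail with AbstractMethodError
                // (a LinkageError) at runtime.
                System.out.println("Current schema: " + conn.getSchema());

                // setObject(..., LocalDate) relies on the JDBC 4.2 (Java 8) type
                // mappings; pre-4.2 drivers typically reject or mishandle java.time values.
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT * FROM orders WHERE order_date > ?")) {
                    ps.setObject(1, LocalDate.of(2015, 1, 1));
                    ps.executeQuery();
                }
            }
        }
    }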
You may experience odd behavior with low-level features; for example, statement caching and row fetching were modified a few times in the 11g ojdbc6.jar. I'm not even sure how these features work in the old ojdbc14.jar.
It's a very unusual, and I'd say inadvisable, setup that you want to test. The only way to see whether it works is to try it.
I wrote a simple program to connect to an Access database and display a single record.
But when I run the program, there seems to be an issue with the drivers.
(I'm just getting started in Java.)
** The DSN and table name are correct.
The JDBC-ODBC bridge driver has been removed from JDK 8. Any tutorials you see that use Access probably depend on this class. It's been part of Java for 20 years, so I'm sure you'll find lots of examples that are now obsolete.
You'll either have to buy a JDBC driver for Access or use a real database like MySQL, PostgreSQL, etc.
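For comparison, here is a minimal sketch of what plain JDBC against a real database looks like without the bridge, assuming the driver jar is on the classpath; the URL, credentials and the customers table are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SingleRecordDemo {
        public static void main(String[] args) throws Exception {
            // No Class.forName("sun.jdbc.odbc.JdbcOdbcDriver") -- that class no longer
            // exists on JDK 8+. A JDBC 4 driver jar on the classpath registers itself
            // with DriverManager automatically.
            String url = "jdbc:mysql://localhost:3306/testdb"; // placeholder URL
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT name FROM customers WHERE id = 1")) {
                if (rs.next()) {
                    System.out.println(rs.getString("name"));
                }
            }
        }
    }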
I've recently upgraded my Cassandra nodes from version 1.0.11 to version 2.1.8.
Version 1.0.11 uses the Hector Java driver, which isn't supported anymore, whereas for v2.1.8 we are using the Native CQL Driver.
When I added the libraries for the new version, I ran into a serious problem with the Guava library:
Hector Driver - works with (legacy) guava-r09.jar.
Native CQL Driver - works with guava-18.0.jar.
So I tried to remove the old Guava jar (r09) and replace it with the new one (v18.0), but it isn't compatible with Hector, since Hector has been effectively deprecated for a while and doesn't work with newer versions of anything.
On the other hand, the Native CQL Driver won't work with Guava versions lower than v18.
And of course, putting both on the classpath breaks the whole system, since many classes clash with each other.
Some possible solutions I've been thinking about are:
Migrating our whole project to the new driver - this is a solution, but it would take ages, since we've got a lot of complex code.
Digging deep into the Hector classes and manually changing them to be compatible with v18 - this is another solution, but it would take time and be very risky. Besides, with each Guava update I'd need to make modifications again, which could take a lot of time.
Two parallel apps? I can't really figure this one out...
Does anyone have any other, easier suggestions? Something like a selective library version for one query and a different version for others? Maybe some sort of Docker setup? Or perhaps a virtual-environment solution?
Thanks in advance,
Adam.
Is there a Java tool available that can convert a MySQL dump into a PostgreSQL dump?
Googling got me this: https://github.com/maxlapshin/mysql2postgres, which is a Ruby gem.
In my current development environment, installing Ruby is not allowed.
The versions used:
MySQL 5.1
PostgreSQL 8.2
Note: mysqldump --compatible=postgresql didn't work!
Thanks.
First, PostgreSQL 8.2 is ancient and unsupported. Upgrade urgently. Read the release notes for each .0 version to find out about any compatibility issues you may face.
As for the conversion, you should generally do it in two phases. Convert and load the schema, then convert and load the data.
Generally automated tools won't do a good job converting database schemas. You should do a schema-only dump, run a conversion tool over it then hand-edit and hand-check it before loading it into PostgreSQL.
Once you have a schema that looks sane, do a data-only dump from MySQL and try loading that into a PostgreSQL instance with your converted schema loaded in it. mysqldump --compatible=postgresql may do a better job, though you'll probably need additional flags too.
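To give a feel for the kind of mechanical rewriting a conversion tool applies to the schema dump (and why hand-checking the result remains necessary), here is a rough Java sketch. The file names and the handful of substitutions are illustrative only; this is not a complete converter:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class SchemaDumpCleanup {
        public static void main(String[] args) throws IOException {
            Path in = Paths.get("mysql-schema.sql");      // schema-only dump (placeholder name)
            Path out = Paths.get("postgres-schema.sql");

            String sql = new String(Files.readAllBytes(in), StandardCharsets.UTF_8);

            // A few of the mechanical rewrites a converter has to make. A real
            // migration needs many more (AUTO_INCREMENT columns, indexes, enums...)
            // plus a manual review of the result.
            sql = sql.replace("`", "\"")                                // backtick identifier quoting
                     .replaceAll("(?i)\\s*ENGINE=\\w+", "")             // storage engine clauses
                     .replaceAll("(?i)\\s*DEFAULT CHARSET=\\w+", "")    // charset clauses
                     .replaceAll("(?i)\\bAUTO_INCREMENT=\\d+", "")      // table-level counters
                     .replaceAll("(?i)\\btinyint\\(1\\)", "boolean")    // common type mapping
                     .replaceAll("(?i)\\bdatetime\\b", "timestamp");

            Files.write(out, sql.getBytes(StandardCharsets.UTF_8));
        }
    }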
If you try it and still don't have any luck, follow up with more detail, and report the exact error messages rather than just "it doesn't work".
Consider downloading the advanced server and using its built-in migration toolkit.
However, as Craig said, you should upgrade to a supported version of PostgreSQL.
I have an application which uses Java SE 5 and Hibernate 2.5. I have to upgrade/migrate it to Java 6 and a newer version of Hibernate. What is the best strategy?
Should I upgrade directly to the newest stable release (at the moment, 3.6), or does it make more sense to upgrade to 3.0 first?
Is such a migration a lot of effort?
I have no experience with Hibernate yet, but I have already used TopLink JPA 1.0 in projects.
Can you give me some hints? Thank you...
Best regards,
Kai Wähner
It depends. If you'll be using Java EE, then don't bother with Hibernate versions and focus on JPA 2.0; Hibernate will just be the JPA implementation underneath. If you really need a Hibernate-specific feature, check which Hibernate version your application server ships.
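As a minimal sketch of what "focus on JPA" means in code, the class below uses only the javax.persistence API, so the provider stays swappable. The entity, its fields and the persistence unit name are made-up examples; the unit must match one defined in your persistence.xml, where the provider (e.g. Hibernate) is configured:

    import javax.persistence.Entity;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Persistence;

    @Entity
    public class Customer {
        @Id
        @GeneratedValue
        private Long id;
        private String name;
        // getters and setters omitted for brevity
    }

    class CustomerRepository {
        public static void main(String[] args) {
            // "examplePU" is a placeholder persistence unit name.
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("examplePU");
            EntityManager em = emf.createEntityManager();
            try {
                em.getTransaction().begin();
                em.persist(new Customer());   // no Hibernate classes anywhere in this code
                em.getTransaction().commit();
            } finally {
                em.close();
                emf.close();
            }
        }
    }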
If you are not using Java EE, then I would go with Hibernate 3.6 if I'm planning to deploy my app in some months, or I would use 3.5 if I plan to deploy the app to production sometime next week.
The basic idea is: use the latest GA at the time you put something in production. This way, you ensure that you'll have a "supportable" version for a long time.
It's also worth mentioning that the community versions are not always supported by the vendor. In this case, you won't get a support contract from Red Hat for Hibernate 3.5 or 3.6. If you need support, you'll have to choose from whatever versions they officially support.
I say go for the latest stable release, as it will contain further improvements and bug fixes and give you more benefit overall. There have been many changes since 2.5, so the migration will not be trivial, but later on, when you upgrade to subsequent versions, the migration steps will be smaller and easier.
Take a look at the migration guides to help you.
What is the best strategy?
It depends on whether you have to maintain a lot of projects. Only do updates when you're actively working on a project; in other words, if it isn't broken, don't fix it. If you are still developing and you run into problems with libraries, then check whether a more recent version has already fixed the bug.