Problem
When trying to deploy an Apache Beam pipeline on the Google Cloud Platform Dataflow service that connects to an Oracle 11gR2 (11.2.0.4) database to retrieve rows, I received the following error when using the Apache Beam JdbcIO transform:
Error message from worker: java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (ORA-00604: error occurred at recursive SQL level 1 ORA-01882: timezone region not found )
To solve the problem, I updated the pom.xml:
<!-- https://mvnrepository.com/artifact/com.oracle.database.jdbc/ojdbc6 -->
<dependency>
    <groupId>com.oracle.database.jdbc</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0.4</version>
</dependency>
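For context, here is a minimal sketch of how JdbcIO can reference the driver that this pom entry provides. This is a sketch, not the original pipeline: it assumes the beam-sdks-java-io-jdbc module is on the classpath, and the connection URL, credentials, and query are placeholders.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OracleJdbcIoSketch {
    public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline.apply("ReadFromOracle",
            JdbcIO.<String>read()
                // The Oracle driver class is loaded from the ojdbc6 artifact declared above.
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                            "oracle.jdbc.OracleDriver",
                            "jdbc:oracle:thin:@//db-host:1521/SERVICE") // placeholder host and service
                        .withUsername("user")        // placeholder credentials
                        .withPassword("password"))
                .withQuery("SELECT some_column FROM some_table")        // placeholder query
                .withRowMapper(resultSet -> resultSet.getString(1))
                .withCoder(StringUtf8Coder.of()));

        pipeline.run();
    }
}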
Related
I'm writing a Java streaming pipeline with Apache Beam that reads messages from Google Cloud PubSub and should write them into an ElasticSearch instance. Currently I'm using the direct runner, but the plan is to deploy the solution on Google Cloud Dataflow.
First of all, I wrote a pipeline that reads from PubSub and writes to text files, and it works. Then I set up the ElasticSearch instance, and that also works: I wrote some documents with curl and it was easy.
Then, when I tried to perform the write with Beam's ElasticSearch connector, I started to get errors. Specifically, I get java.lang.NoSuchMethodError: org.elasticsearch.client.RestClient.performRequest, despite the fact that I added the dependency to my pom.xml file.
What I'm doing is essentially this:
messages.apply(
    "TwoMinWindow",
    Window.into(FixedWindows.of(new Duration(120 * 1000)))
).apply(
    "ElasticWrite",
    ElasticsearchIO.write()
        .withConnectionConfiguration(
            ElasticsearchIO.ConnectionConfiguration
                .create(new String[]{"http://xxx.xxx.xxx.xxx:9200"}, "streaming_data", "string")
                .withUsername("xxxx")
                .withPassword("xxxxxxxx")
        )
);
Using the DirectRunner, I'm able to connect to PubSub, but I get an exception when the pipeline tries to connect with the ElasticSearch instance:
java.lang.NoSuchMethodError: org.elasticsearch.client.RestClient.performRequest(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/http/Header;)Lorg/elasticsearch/client/Response;
at org.apache.beam.sdk.util.UserCodeException.wrap (UserCodeException.java:34)
at org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO$Write$WriteFn$DoFnInvoker.invokeSetup (Unknown Source)
at org.apache.beam.sdk.transforms.reflect.DoFnInvokers.tryInvokeSetupFor (DoFnInvokers.java:50)
at org.apache.beam.runners.direct.DoFnLifecycleManager$DeserializingCacheLoader.load (DoFnLifecycleManager.java:104)
at org.apache.beam.runners.direct.DoFnLifecycleManager$DeserializingCacheLoader.load (DoFnLifecycleManager.java:91)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture (LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync (LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad (LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get (LocalCache.java:2044)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get (LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad (LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get (LocalCache.java:4958)
at org.apache.beam.runners.direct.DoFnLifecycleManager.get (DoFnLifecycleManager.java:61)
at org.apache.beam.runners.direct.ParDoEvaluatorFactory.createEvaluator (ParDoEvaluatorFactory.java:129)
at org.apache.beam.runners.direct.ParDoEvaluatorFactory.forApplication (ParDoEvaluatorFactory.java:79)
at org.apache.beam.runners.direct.TransformEvaluatorRegistry.forApplication (TransformEvaluatorRegistry.java:169)
at org.apache.beam.runners.direct.DirectTransformExecutor.run (DirectTransformExecutor.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.elasticsearch.client.RestClient.performRequest(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/http/Header;)Lorg/elasticsearch/client/Response;
at org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.getBackendVersion (ElasticsearchIO.java:1348)
at org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO$Write$WriteFn.setup (ElasticsearchIO.java:1200)
What I added to the pom.xml is:
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
    <version>${beam.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.elasticsearch.client/elasticsearch-rest-client -->
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-client</artifactId>
    <version>${elastic.version}</version>
</dependency>
I'm stuck on this problem and I don't know how to solve it. If I use a JestClient, I'm able to connect to ElasticSearch without any issue.
Do you have any suggestions?
You are using a newer version of RestClient that does not have the method performRequest(String, String, Header...). If you look at the latest source code, you can see that the method now takes a Request, whereas older versions had overloads that took Strings and Headers.
Those overloads were deprecated and then removed from the code on September 1, 2018.
Either change your code to use the newer Elasticsearch client library, or specify an older version of the library that is still compatible with your code (it needs to be before 7.0.x, e.g. 6.8.4).
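For illustration, here is roughly what the newer, Request-based RestClient API looks like. This is a standalone sketch rather than Beam code, and the host and endpoint are placeholders.

import org.apache.http.HttpHost;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class RestClientSketch {
    public static void main(String[] args) throws Exception {
        // Newer clients build a Request object instead of passing method, endpoint, and headers separately.
        try (RestClient restClient = RestClient.builder(
                new HttpHost("xxx.xxx.xxx.xxx", 9200, "http")).build()) {
            Request request = new Request("GET", "/");   // placeholder endpoint
            Response response = restClient.performRequest(request);
            System.out.println(response.getStatusLine());
        }
    }
}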
When I add this to the pom.xml:
<!-- https://mvnrepository.com/artifact/us.fatehi/schemacrawler-mariadb -->
<dependency>
    <groupId>us.fatehi</groupId>
    <artifactId>schemacrawler-mariadb</artifactId>
    <version>14.08.06</version>
</dependency>
Then I get an error:
java.util.ServiceConfigurationError: schemacrawler.tools.databaseconnector.DatabaseConnector: Provider schemacrawler.server.mariadb.MariaDBDatabaseConnector could not be instantiated
..
Caused by: java.lang.NoSuchMethodError: schemacrawler.tools.databaseconnector.DatabaseConnector.<init>(Lschemacrawler/tools/databaseconnector/DatabaseServerType;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)V
I am trying to connect to an Oracle database. This works if I omit MariaDB from the pom.
I am using a higher version of SchemaCrawler:
<dependency>
    <groupId>us.fatehi</groupId>
    <artifactId>schemacrawler</artifactId>
    <version>14.21.02</version>
</dependency>
<!-- https://mvnrepository.com/artifact/us.fatehi/schemacrawler-oracle -->
<dependency>
    <groupId>us.fatehi</groupId>
    <artifactId>schemacrawler-oracle</artifactId>
    <version>14.21.02</version>
</dependency>
I would like to keep MariaDB in the pom.xml and still be able to read Oracle with SchemaCrawler. The error occurs after connecting to the database, in the last line of the following code:
Connection dbConnection = DatabaseBroker.getDbConnection(
    eventName,
    cbDatabase.getValue(),
    tConnectionString.getValue(),
    tUsername.getValue(),
    tPassword.getValue()
);
//Schema schema = SchemaCrawler.getSchema(dbConnection, SchemaInfoLevel.detailed(), new SchemaCrawlerOptions());
//SchemaCrawler sc = new SchemaCrawler(dbConnection, null);
try
{
    Catalog catalog = SchemaCrawlerUtility.getCatalog(dbConnection, null);
You are using incompatible versions of the main SchemaCrawler library and a SchemaCrawler database plugin. You do not need a plugin for MariaDB if you are connecting to Oracle. In fact, SchemaCrawler will work with most databases even without a SchemaCrawler database plugin on the classpath.
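For illustration, here is a minimal sketch of crawling the Oracle connection once the SchemaCrawler artifacts on the classpath are on matching versions. The wrapper class and method names below are hypothetical; SchemaCrawlerUtility and SchemaCrawlerOptions are the 14.x API already used in the question.

import java.sql.Connection;
import schemacrawler.schema.Catalog;
import schemacrawler.schema.Table;
import schemacrawler.schemacrawler.SchemaCrawlerOptions;
import schemacrawler.utility.SchemaCrawlerUtility;

public class OracleCrawlSketch {
    // dbConnection is the Oracle java.sql.Connection obtained in the question's code.
    public static void printTables(Connection dbConnection) throws Exception {
        SchemaCrawlerOptions options = new SchemaCrawlerOptions(); // default options; adjust the info level as needed
        Catalog catalog = SchemaCrawlerUtility.getCatalog(dbConnection, options);
        for (Table table : catalog.getTables()) {
            System.out.println(table.getFullName());
        }
    }
}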
I have a JDK 7 app running on Tomcat, and it has the following environment settings:
-Dhttps.protocols=TLSv1.1,TLSv1.2
The above setting ensures that we don't use TLS 1.0 when connecting over HTTPS while making API calls etc.
We also use the org.springframework.mail.javamail.JavaMailSenderImpl class to send outgoing SMTP email, and use these props:
mail.smtp.auth=false;mail.smtp.socketFactory.port=2525;mail.smtp.socketFactory.fallback=true;mail.smtp.starttls.enable=true
The problem is that the connection to the SMTP email server fails once the server is upgraded to TLS 1.2.
javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
Is there a setting or code change that will force the TLS 1.2 protocol?
I did some searching, and it looks like these env settings are only for applet and web clients, not for server-side apps:
-Ddeployment.security.SSLv2Hello=false -Ddeployment.security.SSLv3=false -Ddeployment.security.TLSv1=false
This is the fix for the next guy looking:
mail.smtp.starttls.enable=true;
mail.smtp.ssl.protocols=TLSv1.2;
It didn't work for me in one pretty old app and I couldn't figure out why. After some research I found that the javax.mail version in the app's dependencies was 1.4. You must upgrade to at least 1.5.
I needed both Vojtech Zavrel's and Sunny's answers in my case. I was running Java 1.8, Spring Boot 1.2.5, and Spring 4.2.1.RELEASE on Big Sur 11.2.3.
After I updated my dependency like this:
<dependency>
    <groupId>javax.mail</groupId>
    <artifactId>mail</artifactId>
    <version>1.5.0-b01</version>
</dependency>
and updated my JavaMailSenderImpl with:
Properties prop = new Properties();
prop.setProperty("mail.smtp.auth", "true");
prop.setProperty("mail.smtp.starttls.enable", "true");
prop.setProperty("mail.smtp.ssl.protocols", "TLSv1.2"); // Added this line
prop.setProperty("mail.smtp.ssl.trust", mailUri.getHost());
mailSender.setJavaMailProperties(prop);
the Received fatal alert: protocol_version error was resolved.
An update to the most recent version (1.6.2) of JavaMail also fixes the issue. In my case I upgraded from:
<dependency>
    <groupId>javax.mail</groupId>
    <artifactId>mail</artifactId>
    <version>1.5.0-b01</version>
</dependency>
to:
<dependency>
    <groupId>com.sun.mail</groupId>
    <artifactId>javax.mail</artifactId>
    <version>1.6.2</version>
</dependency>
This fixed the error
javax.mail.AuthenticationFailedException: 421 4.7.66 TLS 1.0 and 1.1 are not supported. Please upgrade/update your client to support TLS 1.2
that I was getting from an Outlook SMTP server. No property changes were needed.
I have a simple REST app on Spring. For deployment I created two profiles, dev and heroku. With the dev profile everything is fine, but I can't deploy to Heroku:
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.5.3:update (default) on project myProject: Error setting up or running Liquibase: liquibase.exception.DatabaseException: liquibase.exception.DatabaseException: Connection could not be created to jdbc:postgres://aa2-22-222-222-222.aaaaaa-1.amazonaws.com:5432/aaaaaaaaa with driver org.postgresql.Driver. Possibly the wrong driver for the given database URL -> [Help 1]
I thought the problem was my old driver (locally I am using PostgreSQL 9.4, but Heroku is on 9.6):
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <version>42.1.1</version>
</dependency>
After the update the issue was not resolved.
I tried to use connection strings both directly and from System.env, but the result was the same.
How can I fix this?
Your database URL starts with jdbc:postgres:// but should be jdbc:postgresql://.
I recommend using the provided JDBC_DATABASE_URL environment variable instead of parsing the DATABASE_URL yourself:
https://devcenter.heroku.com/articles/connecting-to-relational-databases-on-heroku-with-java#using-the-jdbc_database_url
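For reference, here is a minimal sketch of that approach. Heroku populates JDBC_DATABASE_URL with a complete jdbc:postgresql:// URL including credentials, so no manual parsing is needed; the class and method names here are just placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class HerokuDataSource {
    // Opens a connection using the JDBC_DATABASE_URL environment variable that Heroku injects.
    public static Connection getConnection() throws SQLException {
        String dbUrl = System.getenv("JDBC_DATABASE_URL");
        return DriverManager.getConnection(dbUrl);
    }
}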
I'm trying to implement the sample EventHub application given here, but it's giving me errors. I've followed the exact same steps given in the document. I'm on HDInsight 3.5, Storm 1.0.1.2.5.4.0-121.
Here's the error for the EventHubReader, as seen in the Storm UI.
com.microsoft.eventhubs.client.EventHubException: org.apache.qpid.amqp_1_0.client.ConnectionErrorException: An AMQP error occurred (condition='amqp:unauthorized-access'). TrackingId:53ca4652535f423e5f0049dc08ef9_G22, SystemTracker:gateway2, Timestamp:2/28/2017 7:51:21 AM
at com.microsoft.eventhubs.client.EventHubReceiver.ensureReceiverCreated(EventHubReceiver.java:112) ~[stormjar.jar:?]
at com.microsoft.eventhubs.client.EventHubReceiver.<init>(EventHubReceiver.java:65) ~[stormjar.jar:?]
at com.microsoft.eventhubs.client.EventHubConsumerGroup.createReceiver(EventHubConsumerGroup.java:56) ~[stormjar.jar:?]
at com.microsoft.eventhubs.client.ResilientEventHubReceiver.initialize(ResilientEventHubReceiver.java:63) ~[stormjar.jar:?]
at org.apache.storm.eventhubs.spout.EventHubReceiverImpl.open(EventHubReceiverImpl.java:74) ~[stormjar.jar:?]
...
AMQP error occurred (condition='amqp:unauthorized-access'). TrackingId:53ca4652535f423e825f0049dc08eff9_G22, SystemTracker:gateway2, Timestamp:2/28/2017 7:51:21 AM
at org.apache.qpid.amqp_1_0.client.Receiver.<init>(Receiver.java:223) ~[stormjar.jar:?]
at org.apache.qpid.amqp_1_0.client.Session.createReceiver(Session.java:281) ~[stormjar.jar:?] ... 11 more
EventHubWriter:
com.microsoft.eventhubs.client.EventHubException: An error occurred while sending data.
at com.microsoft.eventhubs.client.EventHubSender.sendCore(EventHubSender.java:93) ~[stormjar.jar:?]
Caused by: org.apache.qpid.amqp_1_0.client.Sender$SenderCreationException: Peer did not create remote endpoint for link, target: my-event-hub
at org.apache.qpid.amqp_1_0.client.Sender.<init>(Sender.java:191) ~[stormjar.jar:?]
pom.xml
<properties>
    <storm.version>1.0.1</storm.version>
    <hadoop.version>2.7.3</hadoop.version>
</properties>
...
<dependency>
    <groupId>com.microsoft</groupId>
    <artifactId>eventhubs</artifactId>
    <version>1.0.2</version>
</dependency>
I've made sure that the Event Hub connection namespace and policy keys in my EventHubs.properties file are correct. I've also opened the .jar artifact and made sure the EventHub classes were included.
Does anyone know how to get it to work?
Answering my own question in case anyone else runs into the same problem. It turns out there's a bug in the storm-eventhubs library:
https://issues.apache.org/jira/browse/STORM-2371?jql=project%20=%20STORM%20AND%20component%20=%20storm-eventhubs%20AND%20resolution%20=%20Unresolved%20ORDER%20BY%20priority%20DESC,%20key%20DESC