How to use BulkProcessor in version 8+ - java

I'm upgrading from the Elasticsearch 7 client to 8 and trying to stop using the deprecated RestHighLevelClient. The issue is that one of my modules uses BulkProcessor, and I can't figure out how to use it with the new client libraries, since none of them is compatible.
public static Builder builder(Client client, Listener listener, Scheduler flushScheduler, Scheduler retryScheduler, Runnable onClose) {
    Objects.requireNonNull(client, "client");
    Objects.requireNonNull(listener, "listener");
    return new Builder(client::bulk, listener, flushScheduler, retryScheduler, onClose);
}
The builder above expects org.elasticsearch.client.internal.Client, and I can't find any implementation I can use in any of the dependencies below:
<dependency>
    <groupId>co.elastic.clients</groupId>
    <artifactId>elasticsearch-java</artifactId>
    <version>8.3.2</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-client</artifactId>
    <version>8.3.2</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch</artifactId>
    <version>8.3.2</version>
</dependency>
Am I missing something?
Thank you!
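For reference, one possible direction (not verified against 8.3.2): newer releases of the elasticsearch-java client (8.7.0 and later) ship a BulkIngester helper (co.elastic.clients.elasticsearch._helpers.bulk.BulkIngester) that plays roughly the same role as BulkProcessor, built on top of ElasticsearchClient rather than the internal Client. A minimal sketch, assuming an already-built ElasticsearchClient named esClient and a placeholder document class MyDoc:
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch._helpers.bulk.BulkIngester;
import java.util.concurrent.TimeUnit;

// esClient: an already-built ElasticsearchClient (assumption); MyDoc: a placeholder document class
BulkIngester<Void> ingester = BulkIngester.of(b -> b
        .client(esClient)                      // the new Java API client, not org.elasticsearch.client.internal.Client
        .maxOperations(500)                    // flush after 500 buffered operations
        .flushInterval(5, TimeUnit.SECONDS));  // or after 5 seconds, whichever comes first

ingester.add(op -> op.index(idx -> idx
        .index("my-index")
        .document(new MyDoc())));

ingester.close(); // flushes any pending operations and releases resources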

Related

RestEasyClient IncompatibleClassChangeError (ResteasyProviderFactory.getContextData(java.lang.Class))

When using org.jboss.resteasy:resteasy-client:4.5.9.Final, I'm getting this exception: Caused by: java.lang.IncompatibleClassChangeError: Expected static method 'java.lang.Object org.jboss.resteasy.spi.ResteasyProviderFactory.getContextData(java.lang.Class)'
However, when I use an earlier version, it seems to work fine. (Or at least it works well enough to fool me.)
Here's my simplified pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>example-project</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>15</maven.compiler.source>
        <maven.compiler.target>15</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <keycloak.version>12.0.4</keycloak.version>
        <!-- Seems to work with this version, but not 4.5.9.Final -->
        <resteasy.version>3.6.2.Final</resteasy.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.keycloak</groupId>
            <artifactId>keycloak-admin-client</artifactId>
            <version>${keycloak.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-client</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-jackson2-provider</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-multipart-provider</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
    </dependencies>
</project>
And here's my code:
package org.example.keycloak;

import org.jboss.resteasy.client.jaxrs.ResteasyClientBuilder;
import org.keycloak.OAuth2Constants;
import org.keycloak.admin.client.Keycloak;
import org.keycloak.admin.client.KeycloakBuilder;
import org.keycloak.representations.AccessTokenResponse;

public class KeycloakClientAuthExample {
    public static void main(String[] args) {
        Keycloak keycloak = KeycloakBuilder.builder()
                .serverUrl("http://localhost:8080/auth")
                .grantType(OAuth2Constants.PASSWORD)
                .realm("DungeoneersDemo")
                .clientId("dungeoneers-data")
                .clientSecret("11111111-2222-3333-4444-555555555555")
                .username("user")
                .password("pass")
                .resteasyClient(
                        // new ResteasyClientBuilderImpl() // <-- For 4.5.9.Final
                        new ResteasyClientBuilder()
                                .connectionPoolSize(10).build()
                ).build();
        keycloak.tokenManager().getAccessToken();
        AccessTokenResponse atr = keycloak.tokenManager().getAccessToken();
        System.out.println(atr.getToken());
    }
}
Seems to work and I get what looks enough to me like a bearer token: eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICI2bWhhWUQ..etc...
However, when I update my resteasy.version to a later version✳ (4.5.9.Final), I get an error:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: RESTEASY003940: Unable to instantiate MessageBodyReader
at org.jboss.resteasy.plugins.providers.RegisterBuiltin.register(RegisterBuiltin.java:78)
at org.jboss.resteasy.plugins.providers.RegisterBuiltin.getClientInitializedResteasyProviderFactory(RegisterBuiltin.java:54)
at org.jboss.resteasy.client.jaxrs.internal.ResteasyClientBuilderImpl.getProviderFactory(ResteasyClientBuilderImpl.java:372)
at org.jboss.resteasy.client.jaxrs.internal.ResteasyClientBuilderImpl.build(ResteasyClientBuilderImpl.java:390)
at org.sandbox.security.openidc.keycloak.KeycloakClientAuthExample.main(KeycloakClientAuthExample.java:22)
Caused by: java.lang.RuntimeException: RESTEASY003940: Unable to instantiate MessageBodyReader
at org.jboss.resteasy.core.providerfactory.CommonProviders.processProviderContracts(CommonProviders.java:93)
at org.jboss.resteasy.core.providerfactory.ClientHelper.processProviderContracts(ClientHelper.java:104)
at org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl.processProviderContracts(ResteasyProviderFactoryImpl.java:841)
at org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl.registerProvider(ResteasyProviderFactoryImpl.java:829)
at org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl.registerProvider(ResteasyProviderFactoryImpl.java:816)
at org.jboss.resteasy.plugins.providers.RegisterBuiltin.registerProviders(RegisterBuiltin.java:109)
at org.jboss.resteasy.plugins.providers.RegisterBuiltin.register(RegisterBuiltin.java:74)
... 4 more
Caused by: java.lang.RuntimeException: RESTEASY003325: Failed to construct public org.jboss.resteasy.plugins.providers.jaxb.JAXBElementProvider()
at org.jboss.resteasy.core.ConstructorInjectorImpl.constructOutsideRequest(ConstructorInjectorImpl.java:250)
at org.jboss.resteasy.core.ConstructorInjectorImpl.construct(ConstructorInjectorImpl.java:209)
at org.jboss.resteasy.core.providerfactory.Utils.createProviderInstance(Utils.java:102)
at org.jboss.resteasy.core.providerfactory.CommonProviders.processProviderContracts(CommonProviders.java:87)
... 10 more
Caused by: java.lang.IncompatibleClassChangeError: Expected static method 'java.lang.Object org.jboss.resteasy.spi.ResteasyProviderFactory.getContextData(java.lang.Class)'
at org.jboss.resteasy.plugins.providers.jaxb.AbstractJAXBProvider.<init>(AbstractJAXBProvider.java:52)
at org.jboss.resteasy.plugins.providers.jaxb.JAXBElementProvider.<init>(JAXBElementProvider.java:46)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
at org.jboss.resteasy.core.ConstructorInjectorImpl.constructOutsideRequest(ConstructorInjectorImpl.java:225)
... 13 more
Process finished with exit code 1
✳ Note: When upgrading to the later version, ResteasyClientBuilder was abstracted and the constructor call needed to be replaced with new ResteasyClientBuilderImpl().
I was typing out my question and, while looking into it one last time before submitting, I found the issue and figured I'd throw it out there just in case someone else has this issue.
It turns out that another RESTEasy dependency needed to be declared explicitly because, otherwise, an old version of it is pulled in transitively. When using the newer version of resteasy-client, the transitive dependency being pulled in was org.jboss.resteasy:resteasy-jaxb-provider:3.9.1.Final. I added the newer version to my pom (org.jboss.resteasy:resteasy-jaxb-provider:4.5.9.Final) and everything now seems to work.
Not sure why this is the case. I would think if the dependency was being pulled in anyway by resteasy-client, it should've been pulling in the same version.
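A quick way to see which resteasy-jaxb-provider version Maven actually resolves is the dependency plugin's tree goal, filtered to the RESTEasy group id:
mvn dependency:tree -Dincludes=org.jboss.resteasy
Anything showing up there at the wrong version is a candidate for pinning explicitly, as above.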
We were facing the same issue and your reply really helped us find the mistake. To let other users check the error in detail, here are some further details:
KC 15.x and KC 16.1 use, in their main pom.xml, RESTEasy version 3.x:
<resteasy.version>3.15.1.Final</resteasy.version>
<resteasy.undertow.version>${resteasy.version}</resteasy.undertow.version>
When you check the version inside the server container, you can see that, from KC 16 on, the new WildFly server uses RESTEasy v4.x.
This mismatch causes a difference in the jars: they effectively behave as if they had scope provided, even though they do not. As you mentioned, the solution is the following.
<properties>
    <resteasy.version>4.5.9.Final</resteasy.version>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-client</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-jackson2-provider</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-multipart-provider</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-jaxb-provider</artifactId>
            <version>${resteasy.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>
This will force 3 main API changes to consider:
final ResteasyClient client = new ResteasyClientBuilder()
        .disableTrustManager()
        .socketTimeout(60, TimeUnit.SECONDS)
        .establishConnectionTimeout(10, TimeUnit.SECONDS)
        .build();

// will need to change to the new methods:

final ResteasyClient client = new ResteasyClientBuilderImpl()
        .disableTrustManager()
        .readTimeout(60, TimeUnit.SECONDS)
        .connectTimeout(10, TimeUnit.SECONDS)
        .build();
And also this change:
HttpServletResponse contextData = ResteasyProviderFactory.getContextData(HttpServletResponse.class);
//Change to
ResteasyProviderFactory resteasyProviderFactory = ResteasyProviderFactoryImpl.getInstance();
HttpServletResponse contextData = resteasyProviderFactory.getContextData(HttpServletResponse.class);
Your previous reply helped us understand the problem under the hood, but other users may want this extra detail. Inspecting a running KC16 container shows all the related libs.

Producer#initTransactions doesn't work with KafkaContainer

I'm trying to send messages to Kafka within a transaction, so I use this code:
try (Producer<Void, String> producer = createProducer(kafkaContainerBootstrapServers)) {
    producer.initTransactions();
    producer.beginTransaction();
    Arrays.stream(messages).forEach(
        message -> producer.send(new ProducerRecord<>(KAFKA_INPUT_TOPIC, message)));
    producer.commitTransaction();
}
...
private static Producer<Void, String> createProducer(String kafkaContainerBootstrapServers) {
    return new KafkaProducer<>(
        ImmutableMap.of(
            ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaContainerBootstrapServers,
            ProducerConfig.CLIENT_ID_CONFIG, UUID.randomUUID().toString(),
            ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true,
            ProducerConfig.TRANSACTIONAL_ID_CONFIG, UUID.randomUUID().toString()
        ),
        new VoidSerializer(),
        new StringSerializer());
}
If I use local Kafka, it works well.
But if I use Kafka TestContainers, it freezes on producer.initTransactions():
private static final String KAFKA_VERSION = "4.1.1";

@Rule
public KafkaContainer kafka = new KafkaContainer(KAFKA_VERSION)
        .withEmbeddedZookeeper();
How can I configure KafkaContainer to work with transactions?
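One thing that may be worth checking (my assumption, not something from the original post): a single-broker KafkaContainer cannot satisfy the broker defaults for the transaction state log (replication factor 3, min ISR 2), so initTransactions() can block waiting for the __transaction_state topic. A common workaround is to relax those broker settings via environment variables, roughly like this:
@Rule
public KafkaContainer kafka = new KafkaContainer(KAFKA_VERSION)
        .withEmbeddedZookeeper()
        // single broker, so lower the transaction state log requirements (assumed fix, untested here)
        .withEnv("KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR", "1")
        .withEnv("KAFKA_TRANSACTION_STATE_LOG_MIN_ISR", "1");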
Try using Kafka for JUnit instead of Kafka Testcontainers. I had the same problem with transactions and got them working this way.
Maven dependency that I used:
<dependency>
    <groupId>net.mguenther.kafka</groupId>
    <artifactId>kafka-junit</artifactId>
    <version>2.1.0</version>
    <scope>test</scope>
</dependency>
I got an exception using Kafka for JUnit as @AntonLitvinenko suggested. My question about it is here.
I added this dependency to fix it (see the issue):
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-test</artifactId>
    <version>2.12.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
        </exclusion>
    </exclusions>
    <scope>test</scope>
</dependency>
Also, I used version 2.0.1 for kafka-junit and kafka_2.11:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>${kafkaVersion}</version>
    <scope>test</scope>
</dependency>

MiniDFSCluster UnsatisfiedLinkError org.apache.hadoop.io.nativeio.NativeIO$Windows.access0

When doing:
new MiniDFSCluster.Builder(config).build()
I get this exception:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:557)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:308)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1020)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:739)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:536)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:595)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:762)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:746)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1438)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1107)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:978)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:807)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:467)
at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:426)
I want to use the Hadoop Minicluster to test my Hadoop HDFS (which does not throw this exception, see java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0).
In my Maven pom.xml I have these dependencies:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
<!-- for unit testing -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
    <type>test-jar</type>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
</dependency>
<!-- for unit testing -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
    <scope>test</scope>
    <classifier>tests</classifier>
</dependency>
As I understand it, I do not need the separate 'hadoop-minicluster' dependency, as it already comes with the hadoop-hdfs artifacts included above.
I am trying to build the MiniDFSCluster in my @BeforeAll.
I have used different configs for the builder:
config = new HdfsConfiguration(); / config = new Configuration();
And different ways to create a path for the baseDir:
config.set(miniDfsClusterHD.HDFS_MINIDFS_BASEDIR, baseDir);
Also, I downloaded hadoop.dll, hdfs.dll and winutils.exe in v2.6.0 and added their path to my environment variables.
I have studied all the related issues I could find on Stack Overflow (without success, obviously) and all the guides and code examples I could find on the internet (there are a few, and they all do it differently).
Can somebody please help me? I am out of ideas.
UPDATE:
I am running the test with the following VM options (which should not be necessary, I think):
-Dhadoop.home.dir=C:/Temp/hadoop
-Djava.library.path=C:/Temp/hadoop/bin
I also tried to set the environment variables directly (which should not be necessary when using the VM options):
System.setProperty("hadoop.home.dir", "C:\\Temp\\hadoop-2.6.0");
System.setProperty("java.library.path", "C:\\Temp\\hadoop-2.6.0\\bin");
I resolved this issue by downloading the source file (org.apache.hadoop.io.nativeio.NativeIO.java) and modifying the line in the access function (line 557 in your case) from:
return access0(path, desiredAccess.accessRight());
to
return true;
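To make the patched class win over the one shipped in hadoop-common without rebuilding Hadoop, a common approach (my assumption, not part of the original answer) is to place the modified NativeIO.java in your own test sources under the same package, so it shadows the jar's class on the test classpath. The relevant part of the patched inner Windows class would look roughly like this:
// src/test/java/org/apache/hadoop/io/nativeio/NativeIO.java (patched copy, sketch only)
public static class Windows {
    // ...rest of the original class left unchanged...

    public static boolean access(String path, AccessRight desiredAccess) throws IOException {
        // original: return access0(path, desiredAccess.accessRight());
        return true; // bypass the native Windows access check
    }
}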

LongComparator does not work in Google Cloud Bigtable with HBase API

I'm trying to build some filters to filter data from Bigtable. I'm using the bigtable-hbase driver and the HBase client libraries. Here are my dependencies from pom.xml:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-common</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-protocol</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>com.google.cloud.bigtable</groupId>
    <artifactId>bigtable-hbase</artifactId>
    <version>${bigtable.version}</version>
</dependency>
I'm filtering data like this:
Filter filterName = new SingleColumnValueFilter(Bytes.toBytes("FName"), Bytes.toBytes("FName"),
        CompareFilter.CompareOp.EQUAL, new RegexStringComparator("JOHN"));
FilterList filters = new FilterList();
filters.addFilter(filterName);
Scan scan1 = new Scan();
scan1.setFilter(filters);
This works OK. But then I add the following to the previous code:
Filter filterSalary = new SingleColumnValueFilter(Bytes.toBytes("Salary"), Bytes.toBytes("Salary"),
        CompareFilter.CompareOp.GREATER_OR_EQUAL, new LongComparator(100000));
filters.addFilter(filterSalary);
and it gives me this exception:
Exception in thread "main" com.google.cloud.bigtable.hbase.adapters.filters.UnsupportedFilterException: Unsupported filters encountered: FilterSupportStatus{isSupported=false, reason='ValueFilter must have either a BinaryComparator with any compareOp or a RegexStringComparator with an EQUAL compareOp. Found (LongComparator, GREATER_OR_EQUAL)'}
at com.google.cloud.bigtable.hbase.adapters.filters.FilterAdapter.throwIfUnsupportedFilter(FilterAdapter.java:144)
at com.google.cloud.bigtable.hbase.adapters.ScanAdapter.throwIfUnsupportedScan(ScanAdapter.java:55)
at com.google.cloud.bigtable.hbase.adapters.ScanAdapter.adapt(ScanAdapter.java:91)
at com.google.cloud.bigtable.hbase.adapters.ScanAdapter.adapt(ScanAdapter.java:43)
at com.google.cloud.bigtable.hbase.BigtableTable.getScanner(BigtableTable.java:247)
So my question is: how do I filter on a long data type? Is it an HBase issue or something Bigtable-specific?
I found this: How do you use a custom comparator with SingleColumnValueFilter on HBase? But I can't load my own jars onto the server, so it is not applicable in my case.
SingleColumnValueFilter supports the following comparators:
BinaryComparator
BinaryPrefixComparator
RegexStringComparator.
See this link for an up-to-date list:
https://cloud.google.com/bigtable/docs/hbase-differences
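Given that constraint, one way to express the salary filter is a BinaryComparator over the raw bytes. This is a sketch under the assumption that the Salary column was written with Bytes.toBytes(long) and never holds negative values, so the big-endian byte order sorts the same way as the numeric value:
Filter filterSalary = new SingleColumnValueFilter(
        Bytes.toBytes("Salary"), Bytes.toBytes("Salary"),
        CompareFilter.CompareOp.GREATER_OR_EQUAL,
        new BinaryComparator(Bytes.toBytes(100000L))); // compares the 8-byte big-endian encoding
filters.addFilter(filterSalary);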

how to handle multipart request using http client

The following is my code to handle multipart requests using HttpClient:
if (methodParams.getDataType().length() > 0 && methodParams.getDataType().equals("org.springframework.web.multipart.MultipartFile")) {
    isMultipart = true;
    MultipartEntity entity = new MultipartEntity(HttpMultipartMode.BROWSER_COMPATIBLE);
    // For usual String parameters
    entity.addPart(methodParams.getVariableDefined(), new StringBody("", "text/plain", Charset.forName("UTF-8")));
    postURL.setEntity(entity);
}
but I get the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.james.mime4j.util.CharsetUtil.getCharset(Ljava/lang/String;)Ljava/nio/charset/Charset;
at org.apache.http.entity.mime.MIME.<clinit>(MIME.java:51)
at org.apache.http.entity.mime.HttpMultipart.<clinit>(HttpMultipart.java:85)
at org.apache.http.entity.mime.MultipartEntity.<init>(MultipartEntity.java:77)
at org.apache.http.entity.mime.MultipartEntity.<init>(MultipartEntity.java:96)
at com.hexgen.tools.HexgenClassUtils.doPOST(HexgenClassUtils.java:151)
at com.hexgen.reflection.HttpClientRequests.handleHTTPRequest(HttpClientRequests.java:74)
at com.hexgen.reflection.HexgenWebAPITest.main(HexgenWebAPITest.java:115)
EDIT:
The following are the dependencies I use:
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpmime</artifactId>
    <version>4.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.0.1</version>
</dependency>
How do I solve this?
Take another look at your dependencies; perhaps you've missed some jars.
You could also replace your old jars with a newer version of httpclient along with httpmime; httpclient no longer relies on James mime4j as of version 4.1.
You may also want to manage your dependencies with Maven, in case you are not using it already.
Edit:
You may add the following
<dependency>
    <groupId>org.apache.james</groupId>
    <artifactId>apache-mime4j</artifactId>
    <version>0.6</version>
</dependency>
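Alternatively, if you move httpclient and httpmime up to 4.3 or later (a sketch, not tested against this project), the mime4j dependency disappears entirely and the multipart entity is built with MultipartEntityBuilder:
import org.apache.http.HttpEntity;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.mime.HttpMultipartMode;
import org.apache.http.entity.mime.MultipartEntityBuilder;
import java.nio.charset.StandardCharsets;

HttpEntity entity = MultipartEntityBuilder.create()
        .setMode(HttpMultipartMode.BROWSER_COMPATIBLE)
        // plain-text form field, equivalent to the StringBody part in the question
        .addTextBody(methodParams.getVariableDefined(), "",
                ContentType.create("text/plain", StandardCharsets.UTF_8))
        .build();
postURL.setEntity(entity);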
