HBaseSink Flume Exception - java

Following is my Flume sink serializer code that splits an event and stores it in HBase. It gives me an error when it receives a null event:
public class MyHbaseEventSerializer implements HbaseEventSerializer {

    // fields implied by the usage below
    private byte[] payload;
    private byte[] cf;
    private Event e;

    @Override
    public void configure(Context context) {}

    @Override
    public void initialize(Event event, byte[] columnFamily) {
        this.payload = event.getBody();
        this.cf = columnFamily;
        this.e = event;
    }

    @Override
    public List<Row> getActions() throws FlumeException {
        List<Row> actions = Lists.newArrayList();
        try {
            // here splitting event and store in HBase.
        } catch (Exception e) {
            throw new FlumeException("Could not get row key!", e);
        }
        return actions;
    }

    @Override
    public List<Increment> getIncrements() {
        List<Increment> incs = new LinkedList<Increment>();
        return incs;
    }

    @Override
    public void close() {}
}
It loops infinitely with this error:
ERROR : [SinkRunner-PollingRunner-DefaultSinkProcessor] (org.apache.flume.SinkRunner$PollingRunner.run:160) - Unable to deliver event. Exception follows.
java.lang.IllegalStateException: begin() called when transaction is OPEN!
at org.apache.flume.channel.BasicTransactionSemantics.begin(BasicTransactionSemantics.java:131)
at org.apache.flume.sink.hbase.HBaseSink.process(HBaseSink.java:234)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
at java.lang.Thread.run(Thread.java:724)
Does anyone have a solution to resolve this?
Thanks in advance.
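The FlumeException thrown from getActions() is what makes the sink's process() call fail here, so one option is to return an empty action list for null or empty event bodies instead of throwing. A minimal sketch (not the poster's actual splitting logic; the delimiter, row key, and column qualifier below are placeholders):

@Override
public List<Row> getActions() throws FlumeException {
    List<Row> actions = Lists.newArrayList();
    // Skip null/empty bodies instead of throwing, so the sink can commit the batch.
    if (payload == null || payload.length == 0) {
        return actions;
    }
    try {
        String[] fields = new String(payload, "UTF-8").split(","); // hypothetical delimiter
        Put put = new Put(fields[0].getBytes("UTF-8"));            // hypothetical row key
        put.add(cf, "col1".getBytes("UTF-8"),
                fields.length > 1 ? fields[1].getBytes("UTF-8") : new byte[0]);
        actions.add(put);
    } catch (Exception ex) {
        throw new FlumeException("Could not get row key!", ex);
    }
    return actions;
}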

Related

Unable to create Kafka Redis Sink with Single Message Transformations

I am trying to create a Kafka Redis sink that deletes a particular key in Redis. One way is to produce a record in Kafka with a specific key and a null value. But as per the use case, generating the keys is not possible. As a workaround, I wrote a single message transformer that takes the message from Kafka, sets a particular key, and sets the value to null.
Here are my Kafka Connect configurations:
"connector.class": "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
"transforms.invalidaterediskeys.type": "com.github.cjmatta.kafka.connect.smt.InvalidateRedisKeys",
"redis.database": "0",
"redis.client.mode": "Standalone",
"topics": "test_redis_deletion2",
"tasks.max": "1",
"redis.hosts": "REDIS-HOST",
"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"transforms": "invalidaterediskeys"
}
Here is the code for the transformation:
public class InvalidateRedisKeys<R extends ConnectRecord<R>> implements Transformation<R> {

    private static final Logger LOG = LoggerFactory.getLogger(InvalidateRedisKeys.class);

    private static final ObjectMapper mapper = new ObjectMapper()
            .configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    @Override
    public ConfigDef config() {
        return new ConfigDef();
    }

    @Override
    public void configure(Map<String, ?> settings) {
    }

    @Override
    public void close() {
    }

    @Override
    public R apply(R r) {
        try {
            return r.newRecord(
                    r.topic(),
                    r.kafkaPartition(),
                    Schema.STRING_SCHEMA,
                    getKey(r.value()),
                    null,
                    null,
                    r.timestamp()
            );
        } catch (IOException e) {
            LOG.error("a.jsonhandling.{}", e.getMessage());
            return null;
        } catch (Exception e) {
            LOG.error("a.exception.{}", e.getMessage());
            return null;
        }
    }

    private String getKey(Object value) throws IOException {
        A a = mapper.readValue(value.toString(), A.class);
        long userId = a.getUser_id();
        int roundId = a.getRound_id();
        return KeyGeneratorUtil.getKey(userId, roundId);
    }
}
where A is:
public class A {
    private long user_id;
    private int round_id;
    // getters and setters omitted for brevity
}
And KeyGeneratorUtil contains a static function that generates the relevant string and returns the result.
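The post does not show KeyGeneratorUtil; a minimal sketch of what such a helper might look like (the key format here is an assumption, not the real one):

public class KeyGeneratorUtil {
    // Hypothetical key format; the actual format is not shown in the post.
    public static String getKey(long userId, int roundId) {
        return userId + ":" + roundId;
    }
}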
I took help from
https://github.com/cjmatta/kafka-connect-insert-uuid
https://github.com/apache/kafka/tree/trunk/connect/transforms/src/main/java/org/apache/kafka/connect/transforms
When I try to initialize Kafka Connect, it says the configuration is invalid. Is there something that I am missing?

Inserting an object from REST API to Kafka using Kafka Connect API

I have some issues developing a Kafka source connector using the Kafka Connect API.
I get data from a REST API using Retrofit and GSON and then try to insert it into Kafka.
Here is my source task class:
public class BitfinexSourceTask extends SourceTask implements BitfinexTickerGetter.OnTickerReadyListener {

    private static final String DATETIME_FIELD = "datetime";

    private BitfinexService service;
    private ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
    private BlockingQueue<SourceRecord> queue = null;
    private BitfinexTickerGetter tickerGetter;

    private final Runnable runnable = new Runnable() {
        @Override
        public void run() {
            try {
                tickerGetter.get();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };

    private ScheduledFuture<?> scheduledFuture;

    @Override
    public String version() {
        return VersionUtil.getVersion();
    }

    @Override
    public void start(Map<String, String> map) {
        service = BitfinexServiceFactory.create();
        queue = new LinkedBlockingQueue<>();
        tickerGetter = new BitfinexTickerGetter(service, this);
        scheduledFuture = scheduler.scheduleAtFixedRate(runnable, 0, 5, TimeUnit.MINUTES);
    }

    @Override
    public List<SourceRecord> poll() throws InterruptedException {
        List<SourceRecord> result = new LinkedList<>();
        if (queue.isEmpty()) result.add(queue.take());
        queue.drainTo(result);
        return result;
    }

    @Override
    public void stop() {
        scheduledFuture.cancel(true);
        scheduler.shutdown();
    }

    @Override
    public void onTickerReady(Ticker ticker) {
        Map<String, ?> srcOffset = Collections.singletonMap(DATETIME_FIELD, ticker.getDatetime());
        Map<String, ?> srcPartition = Collections.singletonMap("from", "bitfinex");
        SourceRecord record = new SourceRecord(srcPartition, srcOffset, ticker.getSymbol(), Schema.STRING_SCHEMA, ticker.getDatetime(), Ticker.SCHEMA, ticker);
        queue.offer(record);
    }
}
I was actually able to build and add the connector. It runs without any errors, but the topic was not created. I decided to create the topic manually and re-ran the connector, but the topic remained empty. Ticker is my POJO containing string and double fields.
Can someone help me with this?
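One thing worth checking (an assumption, since the post does not show how Ticker.SCHEMA is built): when a SourceRecord carries a value schema, Connect's converters generally expect the value to be a Struct built from that schema rather than a raw POJO. A minimal sketch of converting a Ticker to a Struct before enqueueing it, with hypothetical field names and getters:

// Hypothetical schema and conversion; the real Ticker fields are not shown in the post.
private static final Schema TICKER_SCHEMA = SchemaBuilder.struct().name("Ticker")
        .field("symbol", Schema.STRING_SCHEMA)
        .field("datetime", Schema.STRING_SCHEMA)
        .field("lastPrice", Schema.FLOAT64_SCHEMA)
        .build();

private Struct toStruct(Ticker ticker) {
    return new Struct(TICKER_SCHEMA)
            .put("symbol", ticker.getSymbol())
            .put("datetime", ticker.getDatetime())
            .put("lastPrice", ticker.getLastPrice());
}

// ...then in onTickerReady:
// SourceRecord record = new SourceRecord(srcPartition, srcOffset, ticker.getSymbol(),
//         Schema.STRING_SCHEMA, ticker.getDatetime(), TICKER_SCHEMA, toStruct(ticker));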

What's the best practice to call a method out of a callback response?

I'm using an asynchronous XML-RPC client (https://github.com/gturri/aXMLRPC) in my project and wrote some methods using the asynchronous callback methods of this client, like this:
public void xmlRpcMethod(final Object callbackSync) {
    XMLRPCCallback listener = new XMLRPCCallback() {
        public void onResponse(long id, final Object result) {
            // Do something
            if (callbackSync != null) {
                synchronized (callbackSync) {
                    callbackSync.notify();
                }
            }
        }

        public void onError(long id, final XMLRPCException error) {
            // Do something
            if (callbackSync != null) {
                synchronized (callbackSync) {
                    callbackSync.notify();
                }
            }
        }

        public void onServerError(long id, final XMLRPCServerException error) {
            Log.e(TAG, error.getMessage());
            if (callbackSync != null) {
                synchronized (callbackSync) {
                    callbackSync.notifyAll();
                }
            }
        }
    };
    XMLRPCClient client = new XMLRPCClient("<url>");
    long id = client.callAsync(listener, "<method>");
}
In other methods I'd like to call this method (here "xmlRpcMethod") and wait until it has finished. I wrote methods like this:
public void testMethod() {
    Object sync = new Object();
    xmlRpcMethod(sync);
    synchronized (sync) {
        try {
            sync.wait();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    // Do something after xmlRpcMethod has finished
}
But this way of waiting and synchronizing gets ugly as the project grows larger and I need to wait for many requests to finish.
So is this the only possible / best way? Or does someone know a better solution?
My first shot to create blocking RPC calls would be:
// Little helper class:
class RPCResult<T> {
    private final T result;
    private final Exception ex;
    private final long id;

    public RPCResult(long id, T result, Exception ex) {
        // TODO set fields
    }

    // TODO getters
    public boolean hasError() { return null != this.ex; }
}

public Object xmlRpcMethod() {
    final BlockingQueue<RPCResult> pipe = new ArrayBlockingQueue<RPCResult>(1);
    XMLRPCCallback listener = new XMLRPCCallback() {
        public void onResponse(long id, final Object result) {
            // Do something
            pipe.put(new RPCResult<Object>(id, result, null)); // TODO: put() throws InterruptedException, handle it
        }
        public void onError(long id, final XMLRPCException error) {
            // Do something
            pipe.put(new RPCResult<Object>(id, null, error));
        }
        public void onServerError(long id, final XMLRPCServerException error) {
            Log.e(TAG, error.getMessage());
            pipe.put(new RPCResult<Object>(id, null, error));
        }
    };
    XMLRPCClient client = new XMLRPCClient("<url>");
    long id = client.callAsync(listener, "<method>");
    RPCResult result = pipe.take(); // blocks until there is an element available
    // TODO: catch and handle InterruptedException!
    if (result.hasError()) throw result.getError(); // Relay exceptions - do not swallow them!
    return result.getResult();
}
Client:
public void testMethod() {
    Object result = xmlRpcMethod(); // blocks until result is available or throws exception
}
The next step would be to make a strongly typed version, public <T> T xmlRpcMethod().
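A minimal sketch of such a typed wrapper, assuming the same BlockingQueue approach and that the XML-RPC result can simply be cast to the requested type (the varargs signature and error handling are placeholders):

@SuppressWarnings("unchecked")
public <T> T xmlRpcMethod(String method, Object... params) throws Exception {
    final BlockingQueue<RPCResult<T>> pipe = new ArrayBlockingQueue<RPCResult<T>>(1);
    XMLRPCCallback listener = new XMLRPCCallback() {
        public void onResponse(long id, Object result) {
            try { pipe.put(new RPCResult<T>(id, (T) result, null)); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
        public void onError(long id, XMLRPCException error) {
            try { pipe.put(new RPCResult<T>(id, null, error)); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
        public void onServerError(long id, XMLRPCServerException error) {
            try { pipe.put(new RPCResult<T>(id, null, error)); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    };
    XMLRPCClient client = new XMLRPCClient("<url>");
    client.callAsync(listener, method, params);
    RPCResult<T> result = pipe.take();              // blocks until a result or error arrives
    if (result.hasError()) throw result.getError(); // relay the exception to the caller
    return result.getResult();
}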

Java pipes, why I get the write end dead exception

I'm trying to create a program using pipes that communicate between 2 threads (you could say a chat between 2 threads). My problem is that writing works fine, but reading from the pipe throws a "write end dead" exception. I wrote send & receive methods, but my receive method has to know the length of the string sent by the sender, so I wrote another receive method with the same name that does not need to know the length of the string sent.
My code is composed of 3 classes as shown below:
package pipes1;

import java.io.*;

public class Pipe
{
    private PipedWriter writer;
    private PipedReader reader;

    public PipedWriter getWriter()
    {
        return writer;
    }

    public PipedReader getReader()
    {
        return reader;
    }

    public Pipe()
    {
        writer = new PipedWriter();
        reader = new PipedReader();
    }
}
========================================================
package pipes1;

import java.io.IOException;

public class Person
{
    private String name; //name of person
    private String msg1;
    private String msg2;
    private Pipe pipe;

    public String getMsg1()
    {
        return msg1;
    }

    public String getMsg2()
    {
        return msg2;
    }

    public Pipe getPipe()
    {
        return pipe;
    }

    public String getName()
    {
        return name;
    }

    public Person(String name, Pipe pipe, String s1, String s2)
    {
        this.name = name;
        this.msg1 = s1;
        this.msg2 = s2;
        this.pipe = pipe;
    }

    public void connection(Person x) throws Throwable
    {
        pipe.getReader().connect(x.pipe.getWriter());
    }

    public void closing() throws IOException
    {
        this.pipe.getReader().close();
        this.pipe.getWriter().close();
    }

    public void send(String m) throws IOException
    {
        this.pipe.getWriter().write(m);
        this.pipe.getWriter().flush();
    }

    public void recieve() throws IOException
    {
        int data = this.pipe.getReader().read();
        while (data != -1)
        {
            System.out.print((char) data);
            data = this.pipe.getReader().read();
        }
        System.out.println("");
    }

    public void recieve(String m) throws IOException
    {
        int i = 0;
        while (i < m.length())
        {
            System.out.print((char) this.pipe.getReader().read());
            i++;
        }
        System.out.println("");
    }
}
==================================================================
package pipes1;

public class Main
{
    public static void main(String[] args) throws Throwable
    {
        Pipe p1 = new Pipe();
        Pipe p2 = new Pipe();
        Person alice = new Person("Alice", p1, "recieved,thanks", "hi bob");
        Person bob = new Person("Bob", p2, "hi alice", "recieved, thanks");
        alice.connection(bob);
        bob.connection(alice);

        Thread terminal1 = new Thread(new Runnable()
        {
            @Override
            public void run()
            {
                try
                {
                    bob.send(bob.getName() + ":" + bob.getMsg1());
                    bob.recieve(alice.getName() + ":" + alice.getMsg1());
                    bob.recieve(alice.getName() + ":" + alice.getMsg2());
                    bob.send(bob.getName() + ":" + bob.getMsg2());
                    bob.send("hi");
                    bob.send("hi");
                }
                catch (Throwable e)
                {
                    System.out.println(e.getMessage());
                }
            }
        });

        //terminal of a
        Thread terminal2 = new Thread(new Runnable()
        {
            @Override
            public void run()
            {
                try
                {
                    alice.recieve(bob.getName() + ":" + bob.getMsg1());
                    alice.send(alice.getName() + ":" + alice.getMsg1());
                    alice.send(alice.getName() + ":" + alice.getMsg2());
                    alice.recieve(bob.getName() + ":" + bob.getMsg2());
                    alice.recieve();
                    alice.recieve();
                }
                catch (Throwable e)
                {
                    System.out.println(e.getMessage());
                }
            }
        });

        terminal1.start();
        terminal2.start();
    }
}
=================================================================
and the result is this:
Bob:hi alice
Alice:recieved,thanks
Alice:hi bob
Bob:recieved, thanks
hihiWrite end dead
A thread that wrote to a pipe ended without closing the pipe, leaving the pipe broken. A subsequent attempt to read from the PipedReader detected this and threw an IOException.
From the javadoc for the method PipedReader.read():
public int read()
throws IOException
...
Throws:
IOException - if the pipe is broken, unconnected, closed, or an I/O error occurs.
From the javadoc for PipedInputStream:
A pipe is said to be broken if a thread that was providing data bytes to the connected piped output stream is no longer alive.
I think you can avoid the error by adding bob.closing() in the first thread. (I haven't tested that.) Each writer thread should really close the pipe to which it's writing.
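A minimal sketch of what that change to the first thread might look like (untested, following the answer's suggestion), closing Bob's ends in a finally block once he has finished writing:

Thread terminal1 = new Thread(new Runnable()
{
    @Override
    public void run()
    {
        try
        {
            bob.send(bob.getName() + ":" + bob.getMsg1());
            bob.recieve(alice.getName() + ":" + alice.getMsg1());
            bob.recieve(alice.getName() + ":" + alice.getMsg2());
            bob.send(bob.getName() + ":" + bob.getMsg2());
            bob.send("hi");
            bob.send("hi");
        }
        catch (Throwable e)
        {
            System.out.println(e.getMessage());
        }
        finally
        {
            try
            {
                // close Bob's writer so Alice's reader sees end-of-stream instead of a broken pipe
                bob.closing();
            }
            catch (Throwable e)
            {
                System.out.println(e.getMessage());
            }
        }
    }
});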

How to use Netflix ObservableResult and RxJava in asynchronous mode

I was trying to use a Netflix observable; however, I managed to do so only synchronously.
This is how I define the remote call:
@Named
public class BroConsumerService {
    ..
    @HystrixCommand(fallbackMethod = "stubbedMethod")
    public Observable<String> executeObservableBro(String name) {
        return new ObservableResult<String>() {
            @Override
            public String invoke() {
                return executeRemoteService(name);
            }
        };
    }

    private String stubbedMethod(String name) {
        return "return stubbed";
    }

// here I am actually invoking (and observing) this method
@RequestMapping("/executeObservableBro")
public String executeObservableBro(@RequestParam(value = "name", required = false) String name) throws ExecutionException, InterruptedException {
    Observable<String> result = broConsumerService.executeObservableBro(name);
    result.subscribe(new Observer<String>() {
        @Override
        public void onCompleted() {
            System.out.println("completed");
        }

        @Override
        public void onError(Throwable e) {
            System.out.printf(e.getMessage());
        }

        @Override
        public void onNext(String s) {
            System.out.println("on next..");
        }
    });
    return "subscribed"; // placeholder return (the original snippet omits the return value)
}
But that works synchronously. I want to be able to "listen" to executeObservableBro before I execute it, and each time it is executed I'll get notified.
An example would be highly appreciated.
Thanks,
ray.
You have to provide a scheduler in the subscribeOn method, like:
public static void main(String[] args) throws InterruptedException {
    Observable<Integer> observable2 = Observable.create(subscriber -> {
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        Arrays.asList(1, 2, 3).forEach((value) -> subscriber.onNext(value));
        subscriber.onCompleted();
        subscriber.onError(new RuntimeException("error")); // a real source would emit either onCompleted or onError, not both
    });
    System.out.println("Before");
    observable2
        .subscribeOn(Schedulers.io()).subscribe(
            (next) -> log.info("Next element {}", next),
            (error) -> log.error("Got exception", error),
            () -> log.info("Finished") // on complete
        );
    System.out.println("After");
    //Thread.sleep(5000); //uncomment this to wait for subscriptions, otherwise main will quit
}
It's not async by default :)
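Applied to the question's controller, a minimal sketch (assuming executeObservableBro returns an rx.Observable as shown above; the print statements are placeholders) that moves the subscription work off the calling thread:

Observable<String> result = broConsumerService.executeObservableBro(name);
result
    .subscribeOn(Schedulers.io()) // run the Hystrix-wrapped call on a background scheduler
    .subscribe(
        s -> System.out.println("on next: " + s),
        e -> System.err.println("error: " + e.getMessage()),
        () -> System.out.println("completed"));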
