Java SFTP Mule Library Issue: NoClassDefFoundError

I am working on a Java utility that copies files from HDFS to a remote machine over SFTP, and I am having trouble with the SFTP library import. I have included the org.mule transport and the related libraries in the POM, and after running mvn install I can see that the classes are available on the dependency path. However, when I execute the class I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/mule/transport/sftp/SftpClient
at xxxx.yyyyyy.sftpFileTransfer.main(sftpFileTransfer.java:17)
Caused by: java.lang.ClassNotFoundException: org.mule.transport.sftp.SftpClient
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 1 more
The code is as follows:
import java.io.BufferedInputStream;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.mule.transport.sftp.SftpClient;
import com.jcraft.jsch.Channel;
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.ChannelSftp.LsEntry;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.Session;
import com.jcraft.jsch.SftpATTRS;
import com.jcraft.jsch.SftpException;
import java.io.File;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import java.util.Vector;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
public class sftpFileTransfer {
    public static void main(String[] args) throws IOException {
        System.out.println("This is for testing");
        SftpClient sftpCli = new SftpClient("abcdef");
        sftpCli.login("karthick_kb", "/home/karthick/.ssh/id_rsa", null);
        sftpCli.changeWorkingDirectory("/tmp/");
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        String filePath = "/a/b/c/d/e/f/1.dat.gz";
        Path inputPath = new Path(filePath);
        FSDataInputStream fsdisPath = fs.open(inputPath);
        BufferedInputStream bis = new BufferedInputStream(fsdisPath);
        sftpCli.storeFile(inputPath.getName(), bis);
        bis.close();
    }
}
The Maven dependencies are as follows:
<dependencies>
  <dependency>
    <groupId>com.jcraft</groupId>
    <artifactId>jsch</artifactId>
    <version>0.1.54</version>
  </dependency>
  <dependency>
    <groupId>org.mule</groupId>
    <artifactId>mule-core</artifactId>
    <version>3.4.0</version>
  </dependency>
  <dependency>
    <groupId>org.mule.transports</groupId>
    <artifactId>mule-transport-sftp</artifactId>
    <version>3.4.0</version>
  </dependency>
</dependencies>
I am able to see the required SFTP classes in the Maven dependencies. What could I be missing? Any information would be great.
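A note on this error pattern: a NoClassDefFoundError at launch, for a class that compiled cleanly, usually means the Maven dependencies are on the compile classpath but not on the runtime classpath; a bare java command only sees what -cp lists. As a sketch, one common way to run the class with the full dependency classpath is the exec-maven-plugin (the main class name below is taken from the stack trace):

mvn compile exec:java -Dexec.mainClass="xxxx.yyyyyy.sftpFileTransfer"

Alternatively, package the dependencies into the jar (for example with the maven-shade-plugin) before running it with plain java.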

Related

How to resolve the error java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z?

I am trying to upload a file to S3 and am getting an error like:
Target exception: java.lang.NoSuchMethodError:
com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z
Code is
String accessKey="accesskey";
String secretKey="mysecretkey";
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AmazonS3 conn = new AmazonS3Client(credentials);
The target exception occurs on the line AmazonS3 conn = new AmazonS3Client(credentials);.
In total I imported all the Java packages shown below, and I am still getting the same error.
import java.io.File;
import java.io.IOException;
import java.io.ByteArrayInputStream;
import java.util.List;
import com.amazonaws.auth.*;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.util.StringUtils;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.SDKGlobalConfiguration;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import com.amazonaws.services.s3.*;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.AmazonS3Builder;
import com.amazonaws.ClientConfigurationFactory;
import com.amazonaws.annotation.NotThreadSafe;
import com.amazonaws.annotation.SdkTestInternalApi;
import com.amazonaws.client.AwsSyncClientParams;
import com.amazonaws.internal.SdkFunction;
import com.amazonaws.regions.AwsRegionProvider;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.Protocol;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.model.PutObjectRequest;
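For context: a java.lang.NoSuchMethodError like this is almost always a dependency-version conflict rather than a missing import. AmazonS3Client was compiled against a newer SDKGlobalConfiguration than the one actually loaded at runtime, which typically happens when two different AWS SDK versions end up on the classpath. Assuming Maven is being used, mvn dependency:tree shows where the duplicate SDK jars come from, and importing the AWS SDK BOM is one common way to pin every SDK module to a single version; a minimal sketch:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-bom</artifactId>
      <!-- example version only; pick the version your code targets -->
      <version>1.11.1000</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>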

not org.apache.hadoop.mapreduce.Mapper

I am writing a MapReduce project. I want to send an array from the mapper to the reducer, but it produces an error that I can't fix.
I import these classes:
import java.io.DataInput;
import java.io.DataOutput;
import java.io.EOFException;
import java.io.IOException;
import java.net.Socket;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import org.apache.hadoop.conf.Configured;
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.util.*;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.util.Iterator;
import hadoop.DENCLUE;
//import javafx.scene.text.Text;
import sun.security.krb5.Config;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.viewfs.Constants;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
//import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import org.omg.CORBA.PUBLIC_MEMBER;
import com.sun.org.apache.bcel.internal.generic.NEW;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.Mapper.Context;
import org.apache.hadoop.mapreduce.task.JobContextImpl;
import org.apache.hadoop.util.*;
This is my Map class:
public static class Mapn extends MapReduceBase implements Mapper<LongWritable, Text, Text, Text> {
    @SuppressWarnings("rawtypes")
    Context con;

    @SuppressWarnings("unchecked")
    public void map(LongWritable key, Text value, OutputCollector<Text, Text> output,
            Reporter reporter) throws IOException {
        String line = value.toString();
        String[] words = line.split(",");
        for (String word : words) {
            Text outputKey = new Text(word.toUpperCase().trim());
            try {
                con.write(outputKey, words);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
This is the job:
public static void main(String[] args) throws Exception {
    Configuration c = new Configuration();
    String[] files = new GenericOptionsParser(c, args).getRemainingArgs();
    Path input = new Path(files[0]);
    Path output = new Path(files[1]);
    Job j = new Job(c, "wnt");
    j.setJarByClass(projectmr.class);
    j.setMapperClass(Mapn.class);
    j.setReducerClass(Reduce.class);
    j.setOutputKeyClass(Text.class);
    j.setOutputValueClass(Text.class);
    FileInputFormat.addInputPaths(j, input);
    FileOutputFormat.setOutputPath(j, output);
    System.exit(j.waitForCompletion(true) ? 0 : 1);
}
and this is the error I get:
Exception in thread "main" java.lang.RuntimeException: class hadoop.projectmr$Mapn not org.apache.hadoop.mapreduce.Mapper
at org.apache.hadoop.conf.Configuration.setClass(Configuration.java:1969)
at org.apache.hadoop.mapreduce.Job.setMapperClass(Job.java:891)
at hadoop.projectmr.main(projectmr.java:191)
This is the old Hadoop 1 API:
import org.apache.hadoop.mapred.*;
You should instead be importing classes from:
org.apache.hadoop.mapreduce.*;
As the error says:
not org.apache.hadoop.mapreduce.Mapper
So, basically, you don't need MapReduceBase, and Mapper is now a class, not an interface. You would now have:
public static class MyMapper extends Mapper<Kin, Vin, Kout, Vout>
Look at the WordCount example code.
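Applied to this question's Mapn, a minimal sketch against the new API could look like the following (illustrative only: the original tried to write the whole words array through a hand-declared, never-initialized Context field; here each field is emitted through the Context the framework passes in, with the full line as the value, since a Text value cannot hold a String[]):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public static class Mapn extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split the CSV line and emit each upper-cased field as a key,
        // with the whole line as the value.
        for (String word : value.toString().split(",")) {
            context.write(new Text(word.toUpperCase().trim()), value);
        }
    }
}

The driver should likewise stick to the new-API classes: org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(j, input) and org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.setOutputPath(j, output).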

Trying to import vk.core.api in Java

I'm trying to import vk.core.api in Java, but when I compile I only get errors like:
error: package vk.core.api does not exist
import vk.core.api.*;
package main.java.tddt;
import javafx.scene.control.Label;
import main.java.tddt.data.Log;
import main.java.tddt.data.LogList;
import main.java.tddt.data.Timer;
import main.java.tddt.gui.Controller;
import vk.core.api.*;
import vk.core.api.CompilerResult;
import vk.core.api.TestResult;
import vk.core.api.CompileError;
import vk.core.api.CompilerFactory;
import vk.core.api.JavaStringCompiler;
import vk.core.internal.*;
import vk.core.internal.InternalResult;
import java.util.TreeSet;
import javax.xml.bind.JAXBException;
import java.io.File;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Collection;
import java.util.List;
You need to add something like this to the dependencies of your build.gradle:
compile group: 'de.hhu.stups', name: 'virtual-kata-lib', version: '1.0.0'
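As a minimal sketch, the relevant parts of the build.gradle would be (assuming the artifact is resolvable from Maven Central; adjust the repositories block if it is hosted elsewhere):

repositories {
    // assumption: the virtual-kata-lib artifact is published to Maven Central
    mavenCentral()
}

dependencies {
    compile group: 'de.hhu.stups', name: 'virtual-kata-lib', version: '1.0.0'
}

After editing the build file, re-run the Gradle build so the dependency is downloaded before the code is compiled.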

WritableWorkbook cannot be resolved to a type even after importing necessary executable JARs

I am trying to create a new Excel file using the code below. I have added all the necessary JAR files, as you can see from the import lines, but I still get the errors "WritableWorkbook cannot be resolved to a type" and "Workbook cannot be resolved to a type".
package myPackage;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.openqa.selenium.WebElement;
import java.util.List;
import java.io.File;
import java.io.OutputStream;
import org.apache.poi.xssf.usermodel.XSSFCell;
import org.apache.poi.xssf.usermodel.XSSFRow;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.apache.poi.ss.usermodel.Workbook;
public class myClass {
    public static void main(String[] args) {
        File eFile = new File("C:\\Users\\bondrea\\AutomationRelatedFiles\\TrialExcel.xlsx");
        WritableWorkbook FinalExcel = Workbook.createWorkbook(eFile);
    }
}
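WritableWorkbook and Workbook.createWorkbook() belong to the JExcelAPI (jxl) library, namely jxl.write.WritableWorkbook and jxl.Workbook, not to Apache POI, so the POI imports above cannot resolve them. Either add the jxl jar and import those classes (note that JExcelAPI writes only .xls, not .xlsx), or stay with POI, whose jars already appear to be on the classpath. A minimal POI sketch for creating the same .xlsx file, using only the XSSF classes already imported in the question:

import java.io.File;
import java.io.FileOutputStream;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class myClass {
    public static void main(String[] args) throws Exception {
        File eFile = new File("C:\\Users\\bondrea\\AutomationRelatedFiles\\TrialExcel.xlsx");
        // XSSFWorkbook handles the .xlsx format; write an empty workbook to disk.
        try (XSSFWorkbook workbook = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream(eFile)) {
            workbook.createSheet("Sheet1");
            workbook.write(out);
        }
    }
}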

Exception in thread "AWT-EventQueue-0" java.lang.ClassCastException

I'm trying to connect my Java application to SQL Server 2012, but it gives me this error:
Exception in thread "AWT-EventQueue-0" java.lang.ClassCastException: com.microsoft.sqlserver.jdbc.SQLServerConnection cannot be cast to com.mysql.jdbc.Connection
Can anyone help me, please? Thank you very much.
Connection code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
public class CriaConexao {
    public static Connection getConexao() throws SQLException {
        try {
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            String dbURL = "jdbc:sqlserver://BRGDB:1433;database=DB_SQL;IntegratedSecurity=true";
            Connection conexao = DriverManager.getConnection(dbURL);
            return conexao;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }
}
Error:
Exception in thread "AWT-EventQueue-0" java.lang.ClassCastException: com.microsoft.sqlserver.jdbc.SQLServerConnection cannot be cast to com.mysql.jdbc.Connection
at sensores.forms.jTLoginConsulta.<init>(jTLoginConsulta.java:71)
at sensores.forms.jTLoginConsulta$4.run(jTLoginConsulta.java:448)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:744)
at java.awt.EventQueue.access$400(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:697)
at java.awt.EventQueue$3.run(EventQueue.java:691)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:75)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:714)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
jTLoginConsulta imports:
import com.mysql.jdbc.Connection;
import com.mysql.jdbc.PreparedStatement;
import java.awt.*;
import java.awt.Color;
import java.awt.Toolkit;
import java.awt.event.WindowEvent;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.*;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.*;
import java.util.List;
import java.util.StringTokenizer;
import java.util.Timer;
import java.util.TimerTask;
import javax.activation.*;
import javax.mail.*;
import javax.mail.Address;
import javax.mail.Message.RecipientType;
import javax.mail.internet.*;
import javax.swing.*;
import javax.swing.JOptionPane;
import org.openide.util.Exceptions;
import sensores.basedados.CriaConexao;
import sensores.forms.jTMainMenu;
import sensores.logica.Alarmes;
import sensores.logica.Logins;
Line 71 where the error occurs:
68 public jTLoginConsulta() throws SQLException {
69 initComponents();
70 setIcon();
71 conexao=(Connection) CriaConexao.getConexao();
72
73 }
Check the import for this Connection object:
Connection conexao = DriverManager.getConnection(dbURL);
It should be java.sql.Connection. It seems you have wrongly imported com.mysql.jdbc.Connection.
EDIT
As per the edited question:
conexao=(Connection) CriaConexao.getConexao();
CriaConexao is returning a com.microsoft.sqlserver.jdbc.SQLServerConnection and you are wrongly casting it to com.mysql.jdbc.Connection. Just correct the import to java.sql.Connection and remove the cast as well.
Hope it helps.
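A sketch of the corrected lines in jTLoginConsulta (assuming conexao is declared as a java.sql.Connection field; only the import and the assignment change):

import java.sql.Connection;  // replaces import com.mysql.jdbc.Connection

// inside the constructor: no cast is needed, since
// CriaConexao.getConexao() already returns java.sql.Connection
conexao = CriaConexao.getConexao();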
